AI Gateway

I’m seeing an issue enabling reasoning/thinking when routing requests through Vercel’s AI Gateway for the model "xiaomi mimo v2 flash".
Summary
When I set either enable_thinking: true or reasoning: { enabled: true } in the request, reasoning does not appear to be enabled (no reasoning/thinking output, no behavior change). However, when calling the provider directly, reasoning works as expected. The same is true via OpenRouter: reasoning works there as well. This makes it look like the reasoning-related parameters are being ignored or not passed through by Vercel for this model.

Current vs Expected behavior
Current behavior
• Requests via Vercel: reasoning/thinking does not activate even when explicitly enabled via:
• enable_thinking: true, or
• reasoning: { enabled: true }
• No observable difference compared to requests without those parameters.
Expected behavior
• When enabling reasoning via the above parameters, the model should enter reasoning mode (matching behavior seen when calling the provider directly and via OpenRouter).

Code, configuration, and steps to reproduce
1. Use Vercel AI provider with model xiaomi mimo v2 flash.
2. Send a request with either:
• enable_thinking: true, or
• reasoning: { enabled: true }
3. Compare the response/behavior to:
• the same request sent directly to the model provider, and/or
• the same request sent through OpenRouter
Result: reasoning works in the direct-provider and OpenRouter paths, but not via Vercel.
Minimal example request payload (illustrative)
• Model: xiaomi mimo v2 flash
• Parameters tried:
• enable_thinking: true
• reasoning: { enabled: true }
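The two parameter shapes tried above can be sketched as request bodies. This is illustrative only: the model id and parameter shapes are the ones from this report, not confirmed gateway behavior.

```javascript
// Illustrative payloads only; enable_thinking and reasoning.enabled are
// the two shapes tried in this report, not confirmed gateway parameters.
const basePayload = {
  model: "xiaomi/mimo-v2-flash",
  messages: [{ role: "user", content: "Why is the sky blue?" }],
};

// Variant A: top-level flag
const withEnableThinking = { ...basePayload, enable_thinking: true };

// Variant B: nested reasoning object
const withReasoningObject = { ...basePayload, reasoning: { enabled: true } };

console.log(JSON.stringify(withEnableThinking));
console.log(JSON.stringify(withReasoningObject));
```

Either variant was sent on its own; neither changed the model's behavior through the gateway.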

Hi @enzorb, welcome to the Vercel Community!

I’m sorry you are facing this issue.

Can you share the code snippet and the link to docs you’ve been following so far?

Hi! Thanks, I’m sharing both.

For clarity, here’s an OpenAI-compatible snippet that reproduces the issue. I’m using the same approach in production conceptually (same endpoint + same payload shape), but the actual client is OpenWebUI configured to use Vercel AI Gateway at https://ai-gateway.vercel.sh/v1.

OpenAI-compatible repro:

// Node 18+
// npm i openai

import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.VERCEL_AI_GATEWAY_API_KEY,
  baseURL: "https://ai-gateway.vercel.sh/v1",
});

async function main() {
  const res = await client.chat.completions.create({
    model: "xiaomi/mimo-v2-flash",
    messages: [
      {
        role: "user",
        content:
          "Solve this step by step: If a train travels 120km in 80 minutes, what is its average speed in km/h?",
      },
    ],

    // Tried these (separately):
    // enable_thinking: true,
    reasoning: { enabled: true },
  });

  console.log(res.choices?.[0]?.message?.content);
}

main().catch(console.error);

Current behavior: through the gateway, reasoning: { enabled: true } (and also enable_thinking: true) doesn’t appear to enable reasoning for xiaomi/mimo-v2-flash (no observable behavior change vs reasoning-off).

Expected: reasoning should enable like it does when calling the provider directly, and also via OpenRouter (where it works for the same model).

Vercel docs I’ve been following (reasoning): OpenAI-Compatible API

The OpenAI-compatible API docs describe "Reasoning configuration" via a reasoning object (e.g. enabled, budget_tokens, effort, exclude).
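Based on the field names listed in those docs, a fuller reasoning configuration might look like the sketch below. The specific values (token budget, effort level) are assumptions for illustration, not values the docs prescribe.

```javascript
// Hypothetical fuller request using the reasoning fields named in the
// OpenAI-compatible API docs (enabled, budget_tokens, effort, exclude).
// The concrete values here are assumptions, not documented defaults.
const request = {
  model: "xiaomi/mimo-v2-flash",
  messages: [
    { role: "user", content: "Solve step by step: 120 km in 80 minutes, average speed in km/h?" },
  ],
  reasoning: {
    enabled: true,
    budget_tokens: 1024, // assumed: cap on tokens spent on reasoning
    effort: "medium",    // assumed: one of the effort levels
    exclude: false,      // assumed: keep reasoning text in the response
  },
};

console.log(JSON.stringify(request.reasoning));
```

If the gateway honors these fields, the same request that works directly against the provider should also enter reasoning mode here; in my testing it does not.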