streamText terminates after tool call when using MCP tools with toUIMessageStreamResponse

// app/api/chat/route.ts
import { convertToModelMessages, streamText, UIMessage } from 'ai';
import { createMCPClient } from '@ai-sdk/mcp';
import { createOpenAI } from '@ai-sdk/openai';

export const maxDuration = 120;

const baseURL = 'http://litellm-gateway:3333';
const apiKey = 'apikey';

export async function POST(req: Request) {
  try {
    const { messages }: { messages: UIMessage[] } = await req.json();

    const gateway = createOpenAI({
      baseURL: `${baseURL}/v1`,
      apiKey: apiKey,
    });

    const mcpClient = await createMCPClient({
      transport: {
        type: 'http',
        url: `${baseURL}/mcp`,
        headers: {
          Authorization: `Bearer ${apiKey}`,
        },
      },
      name: 'tools',
    });
    const tools = await mcpClient.tools();

    const result = streamText({
      model: gateway('litellm/gemini-2.0-flash'),
      tools: tools,
      system: `You are a helpful assistant. If the user asks for information you cannot answer directly, call the appropriate tool.`,
      messages: convertToModelMessages(messages),
    });
    return result.toUIMessageStreamResponse();

  } catch (error) {
    console.error("Stream error:", error);
    return new Response("Internal Server Error", { status: 500 });
  }
}

When I check in the network tab, I see the following error:

Error: NS_BASE_STREAM_CLOSED

I also tested this logic using Python with smolagents, and it works correctly when calling tools via the MCP Gateway.

The NS_BASE_STREAM_CLOSED error typically occurs when the MCP client connection is closed prematurely. When you’re using MCP tools with streaming, the client needs to stay open until the entire response has completed; if it closes early, you’ll hit this error.

Here’s what I’d recommend checking:

  • Make sure your MCP client isn’t timing out or disconnecting before the stream finishes
  • Verify that your connection handling code keeps the client alive throughout the entire streaming response
  • Check if there are any timeout settings that might be causing the connection to close early
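The second bullet is the crux, and the idea can be sketched without the SDK: bind the client’s `close()` to stream completion (a finish callback) rather than letting it drop when the route handler’s scope exits. Everything below (`FakeMCPClient`, the stream wiring) is a hypothetical stand-in to illustrate the lifecycle, not the AI SDK’s API:

```typescript
// Hypothetical stand-in for an MCP client: calling a tool after close()
// fails, mirroring the premature-close symptom.
class FakeMCPClient {
  closed = false;
  async callTool(name: string): Promise<string> {
    if (this.closed) throw new Error('stream closed: MCP transport was closed early');
    return `result of ${name}`;
  }
  async close(): Promise<void> {
    this.closed = true;
  }
}

// Build a stream that uses the client for each chunk and closes it only
// once the stream itself finishes -- cleanup is tied to stream completion,
// not to the enclosing function returning.
function streamToolResults(client: FakeMCPClient, toolNames: string[]): ReadableStream<string> {
  let i = 0;
  return new ReadableStream<string>({
    async pull(controller) {
      if (i < toolNames.length) {
        controller.enqueue(await client.callTool(toolNames[i++]));
      } else {
        await client.close(); // runs only after the consumer has drained everything
        controller.close();
      }
    },
  });
}

async function main() {
  const client = new FakeMCPClient();
  const reader = streamToolResults(client, ['lookup', 'summarize']).getReader();
  const chunks: string[] = [];
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  console.log(chunks.join(' | ')); // result of lookup | result of summarize
  console.log(client.closed);     // true -- closed only after the stream drained
}

main();
```

With `streamText` itself, the analogous hook (if I’m reading the SDK correctly) is its `onFinish` callback: passing `onFinish: async () => { await mcpClient.close(); }` keeps the transport open for the whole response instead of relying on the handler’s scope.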

Let me know if you have any other questions!

Thanks Pauline P. Narvas for the response! I’ve been stuck on this for about 3 days now and still can’t resolve it.

I was previously using experimental_createMCPClient, and it worked perfectly. However, since I updated my code to use the standard createMCPClient, I began facing the NS_BASE_STREAM_CLOSED issue mentioned above.

The connection seems to drop the moment the LLM starts streaming the tool output back to the UI. Since I am using streamText with toUIMessageStreamResponse(), I suspect the new createMCPClient handles transport lifecycles differently, causing it to be garbage collected or closed because the main function scope finishes while the stream is still active.

Could you provide an example of the recommended pattern for keeping the createMCPClient alive during an asynchronous stream in a Next.js Route Handler? Is there a specific way to bind the client’s lifetime to the toUIMessageStreamResponse so the transport doesn’t close prematurely?