[▲ Vercel Community](/)

[AI SDK](/c/ai-sdk/62)

# How to provide chat message cost without blocking streaming?



17swagat 9326 (@17swagat-9326) · 2025-11-10

I'm using the **Vercel AI SDK** (`ai` package) to stream responses from a model, and I want to include extra provider metadata — specifically the `cost` — in the final `finish` metadata sent to the client.

Here’s the simplified version of my code:

```typescript
const result = streamText({
  prompt,
  model: gateway(ai_model),
});

const providerMetadataPromise = result.providerMetadata;

return result.toUIMessageStreamResponse({
  originalMessages: messages,
  sendReasoning: true,
  messageMetadata: ({ part }): Record<string, string> | undefined => {
    if (part.type === 'start') {
      return { model: ai_model };
    }

    if (part.type === 'finish') {
      let answerCost: string;
      // Problem: the .then() callback runs on a later microtask, after
      // this function has already returned, so answerCost is still
      // unassigned at return time.
      providerMetadataPromise.then(data => {
        answerCost = (data!.gateway as any).cost;
      });
      return { model: ai_model, cost: answerCost! };
    }
  },
});

```

* If I use `await result.providerMetadata` **before** streaming starts, I get the cost correctly — but it blocks the start of the stream.

* If I remove the `await` and use `.then()`, streaming starts instantly — but the cost in the `finish` metadata is always `undefined` on the client.

* I also tried making `messageMetadata` `async`, but it seems the SDK doesn’t await it, so the cost never arrives.
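The second failure mode can be reproduced without the SDK at all. This minimal standalone sketch shows that a value captured via `.then()` is still unset when the enclosing function returns synchronously:

```typescript
// A .then() callback runs on a later microtask, so a synchronous
// return always sees the variable before the callback assigns it.
function readCostSync(metadata: Promise<{ cost: string }>): string | undefined {
  let cost: string | undefined;
  metadata.then((data) => {
    cost = data.cost; // runs after readCostSync has already returned
  });
  return cost; // still undefined at this point
}

const out = readCostSync(Promise.resolve({ cost: '0.0012' }));
console.log(out); // undefined
```

Even with an already-resolved promise, the callback never runs before the synchronous `return`, which matches the `undefined` cost seen on the client.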

---

I want:

* The streaming to start immediately (no delay).

* The final `finish` metadata to contain both `{ model, cost }` values.

* The client to actually receive the cost along with the last message.

---

Is there a correct, non-blocking way to include async provider metadata (like cost) in the `finish` metadata for a streaming response in the Vercel AI SDK?
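One pattern worth trying is to attach a single `.then()` to the metadata promise up front, capture the resolved value into an outer variable, and read that variable in the `finish` branch. The sketch below demonstrates the idea against a mock rather than the real SDK (the `run` helper and `test-model` name are hypothetical), and it assumes `providerMetadata` settles before the `finish` part is emitted — which is not something the SDK guarantees:

```typescript
type Part = { type: 'start' } | { type: 'finish' };

async function run() {
  // Stand-in for result.providerMetadata from the SDK.
  const metadataPromise = Promise.resolve({ gateway: { cost: '0.0012' } });

  // Capture the cost into an outer variable as soon as the promise settles,
  // instead of attaching .then() inside the finish branch.
  let resolvedCost: string | undefined;
  metadataPromise.then((data) => {
    resolvedCost = data.gateway.cost;
  });

  const messageMetadata = ({ part }: { part: Part }) => {
    if (part.type === 'start') return { model: 'test-model' };
    if (part.type === 'finish') return { model: 'test-model', cost: resolvedCost };
  };

  // Simulate the stream: the finish part arrives on a later task,
  // by which time the metadata microtask has already run.
  await new Promise((r) => setTimeout(r, 0));
  return messageMetadata({ part: { type: 'finish' } });
}

run().then((meta) => console.log(meta)); // { model: 'test-model', cost: '0.0012' }
```

If `providerMetadata` can resolve after the `finish` part in real streams, this still races, so the underlying question — whether the SDK awaits anything in `messageMetadata` — remains the crux.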