[AI SDK](/c/ai-sdk/62)

# Vercel AI SDK update



EC (@gpslec88-gmailcom) · 2024-11-22

Hi,

I am upgrading a personal project from AI SDK 3.0 to 3.4. I'm pretty new to programming and am trying to learn by example.

I have replaced `StreamingTextResponse` and can successfully stream a completion back to the client using the code below.


```
const result = await streamText({
  model: azure2('gpt4o'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

return result.toDataStreamResponse();
```

My question is: how would I go about streaming some static text/string that is not from a completion? I previously had the code below as an example.


```
import { Readable } from 'node:stream';

const stream = Readable.from('hello');

// Convert the Node stream to a web ReadableStream so Response accepts it.
return new Response(Readable.toWeb(stream) as ReadableStream, {
  headers: {
    createdAt: new Date().toISOString(),
  },
});
```

Thank you very much.


Nico Albanese (@nicoalbanese) · 2024-11-26 · ♥ 1

Hey! Are you using `useChat` on the frontend? Take a look at [`streamData`](https://sdk.vercel.ai/docs/ai-sdk-ui/streaming-data#streaming-data)

By the way, with 4.0, `streamText` is no longer blocking, so you can remove the `await` 😊

This is great for `streamData`, since appending data won't be blocked by the model's response.
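A rough sketch of how that could fit together in a route handler (assuming AI SDK 4.0, `useChat` on the client, and the `azure2('gpt4o')` provider instance from the original post — a sketch, not a definitive implementation):

```
import { streamText, StreamData } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Static text/data that is not part of the model completion.
  const data = new StreamData();
  data.append('hello'); // sent to the client alongside the completion stream

  // No `await` in 4.0 — streamText returns immediately, so appending
  // data above is not blocked by the model's response.
  const result = streamText({
    model: azure2('gpt4o'), // provider instance assumed from the original post
    messages,
    onFinish() {
      data.close(); // close the data stream once the completion finishes
    },
  });

  return result.toDataStreamResponse({ data });
}
```

On the client, `useChat` exposes the appended values via its `data` property, separate from the streamed message text.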