useChat() gives us prepareSendMessage() to shape the schema/format of the outgoing request message, but I was wondering if there is a similar hook for the response messages as well. Currently the backend doesn't support text streaming and returns the whole response in one go. Would be super helpful if anyone has done this.
I am currently using the AI SDK on the frontend and Ollama with FastAPI on the backend. I wanted to check out the text stream protocol, but it seems the example has been deprecated as of AI SDK v6.
Request format:

```
{
  query: ""  (string)
  url: ""    (string)
}
```
Response format:

```
{
  response: ""      (string)
  references: [""]  (string[])
}
```