I’m using the useChat hook in a Nuxt site. With the handleSubmit function I send some extra data, including the id of the model I want to use for the LLM request that happens on the Nuxt server. This works fine. However, I recently added a tool call, and if I provide a response via the onToolCall handler from useChat, my custom data / modelId is not sent again, which causes an error because my server code expects this data as part of the request. Is there any way to include it when onToolCall provides a response?
Watching the network tab in the browser, I see my original request sent with the custom data. I then see a second request, correlated with the onToolCall response, sent to my Nuxt server API again, but the custom data is not included this time.
// In server/api/chat/send.post.ts
export default defineEventHandler(async (event) => {
  try {
    const body = await readBody(event);
    const { messages, data: requestData }: { messages: Message[]; data?: { model?: string } } =
      body;
    ...
    const result = await streamText({
      // I allow the user to dynamically change the model in the client, so I need
      // the model id to be sent to the Nuxt server endpoint as part of the payload.
      model: openRouter(modelId),
      messages,
      tools: themeTools,
      maxSteps: 5, // allow multi-step tool usage
      onFinish,
    });
    ...
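Not part of my current code, but one server-side mitigation I’m considering is falling back to a default model instead of throwing when the follow-up request arrives without the custom data. A minimal sketch — `pickModelId` and `DEFAULT_MODEL` are hypothetical names, not from the handler above:

```typescript
// Hypothetical fallback id; replace with whatever default makes sense.
const DEFAULT_MODEL = 'openai/gpt-4o-mini';

// Prefer the client-supplied model; fall back when the field is missing,
// e.g. on the automatic follow-up request sent after a tool result.
function pickModelId(requestData?: { model?: string }): string {
  return requestData?.model ?? DEFAULT_MODEL;
}

console.log(pickModelId({ model: 'anthropic/claude-3.5-sonnet' }));
console.log(pickModelId(undefined));
```

This keeps the endpoint from erroring, though it silently ignores the user’s model choice on follow-up requests, so it only papers over the underlying problem.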
// This is a summarized version of the client setup
const { handleSubmit, /* ... */ } = useChat({
  async onToolCall({ toolCall }) {
    // ...
    return `Tool successfully called`;
  },
});
// I use the handler below to submit my form and include the custom data
const customHandleSubmit = async (submitEvent?: Event) => {
  await handleSubmit(submitEvent, {
    data: {
      // This is not sent with the follow-up request triggered by the tool-call response.
      model: selectedModel.value.providerModelId,
    },
  });
};
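One direction I’m looking at (an assumption — please verify against your SDK version): `useChat` also accepts hook-level request options, and those appear to ride along on every request the hook makes, including the automatic follow-up after `onToolCall` resolves, whereas the second argument to `handleSubmit` only applies to that single submit. A sketch, where `buildChatOptions` is a hypothetical helper I wrote for illustration:

```typescript
// Hypothetical helper: build an options object whose `body` would be merged
// into every outgoing request, not just the explicit handleSubmit call.
function buildChatOptions(modelId: string) {
  return {
    api: '/api/chat/send',
    body: { data: { model: modelId } },
  };
}

// Usage sketch (not runnable outside a Nuxt component):
// const { handleSubmit } = useChat({
//   ...buildChatOptions(selectedModel.value.providerModelId),
//   async onToolCall({ toolCall }) {
//     return `Tool successfully called`;
//   },
// });

const opts = buildChatOptions('meta-llama/llama-3.1-70b-instruct'); // hypothetical id
console.log(opts.body.data.model);
```

The open question for me is whether a hook-level `body` picks up later changes to `selectedModel`, since the options are captured when the hook is created rather than per submit.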
Project framework: Nuxt