[Vercel Community](/) · [AI SDK](/c/ai-sdk/62)

# Possible to pass any custom metadata in for middleware?

160 views · 3 likes · 3 posts

**Jamie** (@jamie-getlibretto) · 2025-04-11

I was looking to use the middleware functionality to intercept calls and send the event to our own service. However, I would also like to be able to pass some custom "metadata" into the call. I was thinking of something as simple as:

```ts
const wrappedLlm = wrapLanguageModel({
  model: openai.chat('gpt-4o'),
  middleware: [loggingMiddleware],
});

const { text } = await generateText({
  model: wrappedLlm,
  messages: [{ role: 'user', content: 'Hello' }],
  metadata: {
    field1: 'data 1',
    field2: 'data 2',
  },
});
```

Is there anything like this? Is there some way I could handle this with the telemetry metadata?

**Nico Albanese** (@nicoalbanese) · 2025-04-14 · ♥ 3

Hey! You could use provider metadata for this:

```ts
import { openai } from '@ai-sdk/openai';
import {
  generateText,
  wrapLanguageModel,
  LanguageModelV1Middleware,
} from 'ai';
import 'dotenv/config';

export const yourLogMiddleware: LanguageModelV1Middleware = {
  wrapGenerate: async ({ doGenerate, params }) => {
    console.log('doGenerate called');
    console.log('METADATA', params.providerMetadata);

    const result = await doGenerate();

    console.log('doGenerate finished');
    console.log(`generated text: ${result.text}`);

    return result;
  },
};

const wrappedModel = wrapLanguageModel({
  model: openai('gpt-3.5-turbo'),
  middleware: yourLogMiddleware,
});

async function main() {
  const { text, usage } = await generateText({
    model: wrappedModel,
    prompt: 'Invent a new holiday and describe its traditions.',
    providerOptions: {
      yourLogMiddleware: {
        value1: 'hello',
        value2: 'world',
      },
    },
  });

  console.log(text);
  console.log();
  console.log('Usage:', usage);
}

main().catch(console.error);
```

**Jamie** (@jamie-getlibretto) · 2025-04-14

This is fantastic and exactly what we needed. Thanks!
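To see the mechanics of the accepted answer in isolation, here is a minimal self-contained sketch of the wrapping pattern. This is plain TypeScript, not the real AI SDK types: `wrapModel`, `Model`, `Middleware`, and the metadata shape are all simplified, illustrative names, but the flow mirrors how per-call provider metadata reaches a middleware's `params`:

```typescript
// Illustrative types only -- a stripped-down stand-in for the AI SDK's
// LanguageModelV1 / LanguageModelV1Middleware interfaces.
type ProviderMetadata = Record<string, Record<string, unknown>>;

interface CallParams {
  prompt: string;
  providerMetadata?: ProviderMetadata;
}

interface Model {
  doGenerate(params: CallParams): Promise<{ text: string }>;
}

interface Middleware {
  wrapGenerate(args: {
    doGenerate: () => Promise<{ text: string }>;
    params: CallParams;
  }): Promise<{ text: string }>;
}

// Wrap a model so every call flows through the middleware,
// which receives the full call params -- metadata included.
function wrapModel(model: Model, middleware: Middleware): Model {
  return {
    doGenerate: (params) =>
      middleware.wrapGenerate({
        doGenerate: () => model.doGenerate(params),
        params,
      }),
  };
}

// A middleware that records the metadata it was handed.
const captured: ProviderMetadata[] = [];
const loggingMiddleware: Middleware = {
  wrapGenerate: async ({ doGenerate, params }) => {
    if (params.providerMetadata) captured.push(params.providerMetadata);
    return doGenerate();
  },
};

// A fake model so the sketch runs without any network calls.
const baseModel: Model = {
  doGenerate: async ({ prompt }) => ({ text: `echo: ${prompt}` }),
};

const wrapped = wrapModel(baseModel, loggingMiddleware);

// The caller namespaces its metadata under the middleware's key,
// mirroring the providerOptions pattern from the answer above.
const result = await wrapped.doGenerate({
  prompt: 'Hello',
  providerMetadata: { yourLogMiddleware: { field1: 'data 1' } },
});
```

The key design point is that the wrapper closes over the original call params and passes them to the middleware unchanged, so any caller-supplied metadata rides along without the base model needing to know about it.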