I was looking to use the middleware functionality to intercept calls and send the event to our own service. However, I would also like to be able to pass some custom "metadata" into the call. I was thinking of something as simple as:
const wrappedLlm = wrapLanguageModel({
  model: openai.chat('gpt-4o'),
  middleware: [loggingMiddleware],
});

const { text } = await generateText({
  model: wrappedLlm,
  messages: [{ role: 'user', content: 'Hello' }],
  metadata: {
    field1: 'data 1',
    field2: 'data 2',
  },
});
Is there anything like this? Or is there some way I could handle this with the telemetry metadata?
Hey! You could use provider metadata for this:
import { openai } from '@ai-sdk/openai';
import { generateText, wrapLanguageModel, LanguageModelV1Middleware } from 'ai';
import 'dotenv/config';

export const yourLogMiddleware: LanguageModelV1Middleware = {
  wrapGenerate: async ({ doGenerate, params }) => {
    console.log('doGenerate called');
    // The providerOptions entry keyed by this middleware's name arrives
    // on params.providerMetadata:
    console.log('METADATA', params.providerMetadata);
    const result = await doGenerate();
    console.log('doGenerate finished');
    console.log(`generated text: ${result.text}`);
    return result;
  },
};

const wrappedModel = wrapLanguageModel({
  model: openai('gpt-3.5-turbo'),
  middleware: yourLogMiddleware,
});

async function main() {
  const { text, usage } = await generateText({
    model: wrappedModel,
    prompt: 'Invent a new holiday and describe its traditions.',
    providerOptions: {
      yourLogMiddleware: {
        value1: 'hello',
        value2: 'world',
      },
    },
  });

  console.log(text);
  console.log();
  console.log('Usage:', usage);
}

main().catch(console.error);
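To see the pattern without an API key or provider package, here is a minimal, dependency-free sketch of the same idea: per-call options are passed in a record keyed by the middleware's name, and the middleware reads its own entry. The `SketchMiddleware` and `callModel` names are illustrative stand-ins, not part of the AI SDK.

```typescript
// Per-call options, keyed by middleware name (mirrors providerOptions).
type ProviderOptions = Record<string, Record<string, unknown>>;

// A stripped-down middleware shape: just a name and a wrapGenerate hook.
type SketchMiddleware = {
  name: string;
  wrapGenerate: (args: {
    doGenerate: () => string;
    options: Record<string, unknown> | undefined;
  }) => string;
};

// Stand-in for the wrapped model call: it hands each middleware the
// options stored under that middleware's own key.
function callModel(
  middleware: SketchMiddleware,
  providerOptions: ProviderOptions,
): string {
  return middleware.wrapGenerate({
    doGenerate: () => 'generated text',
    options: providerOptions[middleware.name],
  });
}

const logging: SketchMiddleware = {
  name: 'yourLogMiddleware',
  wrapGenerate: ({ doGenerate, options }) => {
    console.log('METADATA', options);
    return doGenerate();
  },
};

const text = callModel(logging, {
  yourLogMiddleware: { value1: 'hello', value2: 'world' },
});
console.log(text); // "generated text"
```

The key design point is the namespacing: because options are keyed by middleware name, several middlewares can be stacked on one model without their metadata colliding.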
This is fantastic and exactly what we needed. Thanks!
system (Closed April 21, 2025, 11:58am)
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.