Issue
The Vercel AI SDK currently emits telemetry using custom attributes under the `ai.*` namespace, which diverges from the OpenTelemetry Gen AI Semantic Conventions (`gen_ai.*`).
Current implementation:
```json
{
  "ai.prompt.messages": "[{\"role\":\"user\",\"content\":\"...\"}]",
  "ai.response.text": "...",
  "ai.prompt.tools": [...]
}
```
OTel standard:
```json
{
  "gen_ai.input.messages": [{...}],
  "gen_ai.output.messages": [{...}],
  "gen_ai.tool.definitions": [...]
}
```
Impact
- Third-party observability tools (e.g., Phoenix, Langfuse) cannot auto-extract LLM inputs/outputs without writing custom adapters
- Messages are stored as JSON-serialized strings instead of structured objects
- Span kind is `UNKNOWN` instead of semantic types like `CLIENT`
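Until the SDK emits the convention attributes natively, each backend has to do this translation itself (typically in a span processor or ingestion pipeline). A minimal sketch of that attribute mapping, assuming the attribute names shown above; the function name and the wrapping of plain response text into an assistant message are illustrative, not part of either SDK:

```typescript
type Attributes = Record<string, unknown>;

// Hypothetical mapping from the SDK's custom keys to Gen AI convention keys.
const KEY_MAP: Record<string, string> = {
  "ai.prompt.messages": "gen_ai.input.messages",
  "ai.prompt.tools": "gen_ai.tool.definitions",
};

function toGenAiAttributes(attrs: Attributes): Attributes {
  const out: Attributes = {};
  for (const [key, value] of Object.entries(attrs)) {
    if (key === "ai.response.text" && typeof value === "string") {
      // Plain response text becomes a single structured assistant message.
      out["gen_ai.output.messages"] = [{ role: "assistant", content: value }];
    } else if (KEY_MAP[key] !== undefined && typeof value === "string") {
      // Messages/tools arrive as JSON strings; parse into structured objects.
      out[KEY_MAP[key]] = JSON.parse(value);
    } else {
      out[key] = value; // pass unmapped attributes through unchanged
    }
  }
  return out;
}
```

This keeps unknown attributes intact, so a backend can run it unconditionally over every span.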
Question
Is there a plan to adopt the OpenTelemetry Gen AI Semantic Conventions? Doing so would improve interoperability across the LLM observability ecosystem.