
[AI SDK](/c/ai-sdk/62)

# OpenTelemetry Gen AI Semantic Conventions Support

93 views · 0 likes · 1 post


zxzinn (@zxzinn) · 2025-12-13

## Issue

The Vercel AI SDK currently emits telemetry using custom attributes in the `ai.*` namespace, which differ from the [OpenTelemetry Gen AI Semantic Conventions](https://opentelemetry.io/docs/specs/semconv/gen-ai/).

**Current implementation:**

```json
{
  "ai.prompt.messages": "[{\"role\":\"user\",\"content\":\"...\"}]",
  "ai.response.text": "...",
  "ai.prompt.tools": [...]
}
```

**OTel standard:**

```json
{
  "gen_ai.input.messages": [{...}],
  "gen_ai.output.messages": [{...}],
  "gen_ai.tool.definitions": [...]
}
```

## Impact

* Third-party observability tools (e.g., Phoenix, Langfuse) cannot auto-extract LLM I/O without custom adapters
* Messages are stored as JSON strings instead of structured objects
* Span kind is `UNKNOWN` instead of semantic types like `CLIENT`
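Until the SDK adopts the conventions, a collector- or exporter-side rename can bridge part of the gap. A minimal TypeScript sketch, assuming the attribute keys shown above (the mapping table and helper are hypothetical, not part of the AI SDK or any official OTel component):

```typescript
type Attrs = Record<string, unknown>;

// Hypothetical key mapping from the AI SDK's ai.* attributes to the
// Gen AI semantic convention keys named in this post.
const KEY_MAP: Record<string, string> = {
  "ai.prompt.messages": "gen_ai.input.messages",
  "ai.response.text": "gen_ai.output.messages",
  "ai.prompt.tools": "gen_ai.tool.definitions",
};

function toGenAiAttributes(attrs: Attrs): Attrs {
  const out: Attrs = {};
  for (const [key, value] of Object.entries(attrs)) {
    const mapped = KEY_MAP[key] ?? key;
    if (key === "ai.response.text" && typeof value === "string") {
      // Wrap plain response text as a single assistant message (simplified
      // shape; the spec's gen_ai.output.messages structure is richer).
      out[mapped] = [{ role: "assistant", content: value }];
    } else if (typeof value === "string" && key in KEY_MAP) {
      // Messages and tool definitions arrive as JSON strings; decode
      // them into structured values, leaving the raw string on failure.
      try { out[mapped] = JSON.parse(value); }
      catch { out[mapped] = value; }
    } else {
      // Unmapped attributes pass through unchanged.
      out[mapped] = value;
    }
  }
  return out;
}
```

In a real pipeline this logic would live in an OpenTelemetry span processor or a collector transform rather than application code, but the key-by-key rename is the same.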

## Question

Is there a plan to adopt the OpenTelemetry Gen AI Semantic Conventions? This would improve interoperability across the LLM observability ecosystem.

## References

* [OpenTelemetry Gen AI Semantic Conventions](https://opentelemetry.io/docs/specs/semconv/gen-ai/)
* [Gen AI Spans Specification](https://opentelemetry.io/docs/specs/semconv/gen-ai/gen-ai-spans/)
* [OpenInference Semantic Conventions](https://github.com/Arize-ai/openinference/blob/main/spec/semantic_conventions.md)