OpenTelemetry integration for AI Center
OpenTelemetry (OTEL) GenAI semantic conventions define a standardized set of span attributes for observing generative AI applications. Instead of each AI provider or instrumentation library inventing its own telemetry format, these conventions establish a common schema for recording LLM requests, responses, token usage, and tool calls.
By adopting OTEL GenAI conventions, you get:
- Vendor-neutral observability — the same span attributes work regardless of whether you use OpenAI, Anthropic, Google Vertex AI, or another provider.
- Automatic correlation — traces link prompts to completions to downstream tool calls, giving you end-to-end visibility into AI agent workflows.
- Interoperability — any OTEL-compatible collector, backend, or visualization tool can process GenAI spans.
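To make the common schema concrete, here is what the attributes on a single chat-completion span might look like under the new (v1.37.0+) conventions. This is an illustrative sketch, not the output of any particular library; the exact attribute set and message shape your instrumentation emits may differ, so consult the spec for the authoritative schema.

```python
import json

# Hypothetical attribute set for one LLM request span, using
# attribute names from the OTEL GenAI semantic conventions.
chat_span_attributes = {
    "gen_ai.operation.name": "chat",
    "gen_ai.provider.name": "openai",
    "gen_ai.request.model": "gpt-4o",              # model you asked for
    "gen_ai.response.model": "gpt-4o-2024-08-06",  # model that answered
    "gen_ai.usage.input_tokens": 412,
    "gen_ai.usage.output_tokens": 87,
    # Message payloads are JSON *strings*, not native objects.
    "gen_ai.input.messages": json.dumps(
        [{"role": "user", "content": "Hello"}]
    ),
}
```

Because every provider's spans share these names, a backend can aggregate token usage or latency across OpenAI, Anthropic, and Bedrock traffic with a single query.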
Coralogix AI Center natively ingests and renders GenAI spans. It supports both the legacy semantic convention format (v1.28.0 – v1.36.0) and the newer format (v1.37.0+), so you can send spans from any instrumentation library without migration friction. This doc set covers only the new format, which is the migration target across the OTEL GenAI ecosystem; legacy-format spans are accepted but not documented here.
What Coralogix recommends
Instrument your AI application with the OpenTelemetry GenAI semantic conventions. AI Center supports any instrumentation that emits gen_ai.* attributes per the spec — whether you create spans manually, use an OpenTelemetry contrib instrumentation, or use a community library like OpenLLMetry.
Pick the path that fits your stack:
- A library already covers your provider → use it. See Provider compatibility and Framework compatibility for examples of open-source libraries that emit OTEL GenAI semconv.
- No library covers your provider or language → instrument manually. See Span attribute inventory for the attributes AI Center expects.
The libraries surfaced in this doc set are third-party open-source projects, not Coralogix products. Coralogix does not own, maintain, or endorse them. AI Center accepts spans from any of them — and from any custom instrumentation that follows the spec.
Ready to get started?
Jump to a working setup with Code examples — copy-pasteable Python, Java, .NET, and Go scripts that send GenAI spans to Coralogix.
What you need
- An OpenTelemetry-instrumented application — see Provider compatibility and Framework compatibility for the library that fits your stack.
- A Coralogix Send-Your-Data API key.
- Your Coralogix region's OTLP endpoint — ingress.:443 for the region selected at the top of this page.
Use the domain selector at the top of this page to set your Coralogix region. The example commands and code snippets on this page update automatically to use the matching OTLP endpoint.
AI Center processes only trace data, not logs, and retrieves it exclusively from your S3 archive. Data stored in Frequent Search is ignored. Emit your observability data as traces and route it to archive storage.
For the complete list of attributes AI Center consumes, see Span attribute inventory.
Set up auto-instrumentation
Step 1: Configure the Coralogix OTLP endpoint
Select your Coralogix region using the domain selector at the top of this page. The OTLP gRPC endpoint resolves automatically — it appears as ingress.:443 in the env-var block below and updates to match the region you pick.
Step 2: Set environment variables
export OTEL_EXPORTER_OTLP_ENDPOINT="https://ingress.:443"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <your-api-key>"
export OTEL_SERVICE_NAME="my-ai-service"
export OTEL_RESOURCE_ATTRIBUTES="cx.application.name=my-app,cx.subsystem.name=my-subsystem"
export OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
export OTEL_SEMCONV_STABILITY_OPT_IN=gen_ai_latest_experimental
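A missing or empty variable here usually fails silently: the app runs, but no spans are exported. If your launcher is in Python, a small preflight check like the following (a hypothetical helper, not part of any SDK) can catch that before you start debugging downstream:

```python
import os

# The exporter variables from the block above that must be set.
REQUIRED_VARS = [
    "OTEL_EXPORTER_OTLP_ENDPOINT",
    "OTEL_EXPORTER_OTLP_HEADERS",
    "OTEL_SERVICE_NAME",
    "OTEL_RESOURCE_ATTRIBUTES",
]

def missing_otel_vars(env=os.environ):
    """Return the names of required exporter variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example: only the endpoint is set, so three variables are reported missing.
example_env = {"OTEL_EXPORTER_OTLP_ENDPOINT": "https://ingress.example.com:443"}
print(missing_otel_vars(example_env))
```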
Step 3: Install auto-instrumentation
The setup varies by language and provider. See Code examples for complete, copy-paste-ready scripts for OpenAI Agents, Anthropic Claude, AWS Bedrock, and more.
If your provider or language lacks an open-source instrumentation library, you can manually create GenAI spans. See Span attribute inventory for the full list of gen_ai.* attributes AI Center consumes.
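As a minimal sketch of what manual instrumentation involves, the helper below (hypothetical, not part of any SDK) assembles a gen_ai.* attribute mapping for a chat span; with the OpenTelemetry SDK you would attach the result to a span you create yourself, e.g. via `span.set_attributes(...)`. Note that it JSON-encodes the message list, since AI Center expects a string:

```python
import json

def genai_chat_attributes(provider, request_model, input_messages,
                          input_tokens=None, output_tokens=None):
    """Build a gen_ai.* attribute mapping for a manually created chat span.

    Message payloads must be JSON-encoded strings, and each message needs
    a "role" field -- two of the most common instrumentation mistakes.
    """
    attrs = {
        "gen_ai.operation.name": "chat",
        "gen_ai.provider.name": provider,
        "gen_ai.request.model": request_model,
        "gen_ai.input.messages": json.dumps(input_messages),
    }
    # Token counts are optional; omit them rather than sending nulls.
    if input_tokens is not None:
        attrs["gen_ai.usage.input_tokens"] = input_tokens
    if output_tokens is not None:
        attrs["gen_ai.usage.output_tokens"] = output_tokens
    return attrs
```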
See Provider compatibility for the full 16-provider × 5-language table and Framework compatibility for agent frameworks and orchestration tools.
Troubleshooting
Spans not appearing in AI Center
Coralogix AI Center filters for GenAI spans using gen_ai.provider.name or gen_ai.input.messages. If neither attribute is set, the span will not appear.
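A quick way to sanity-check your spans before sending them is a toy predicate that mirrors the filter described above (an illustration of the rule, not AI Center's actual implementation):

```python
def visible_in_ai_center(span_attributes):
    """True if the span carries at least one of the attributes
    AI Center uses to recognize GenAI spans."""
    return any(
        span_attributes.get(key) is not None
        for key in ("gen_ai.provider.name", "gen_ai.input.messages")
    )

print(visible_in_ai_center({"gen_ai.provider.name": "openai"}))  # True
print(visible_in_ai_center({"gen_ai.request.model": "gpt-4o"}))  # False: neither key set
```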
Missing message content
Many libraries do not capture message content by default. Enable it:
# OTel Python contrib
export OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
# OpenLLMetry / Traceloop
export TRACELOOP_TRACE_CONTENT=true
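Because different libraries read different flags, a small preflight check (a hypothetical helper covering just the two flags above) can confirm a content-capture toggle is on before you hunt for missing messages elsewhere:

```python
import os

# Known content-capture flags; other libraries may use different ones.
CONTENT_FLAGS = [
    "OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT",  # OTel Python contrib
    "TRACELOOP_TRACE_CONTENT",                             # OpenLLMetry / Traceloop
]

def content_capture_enabled(env=os.environ):
    """True if any known message-content flag is set to a truthy value."""
    return any(env.get(flag, "").lower() in ("true", "1") for flag in CONTENT_FLAGS)

print(content_capture_enabled({"TRACELOOP_TRACE_CONTENT": "true"}))  # True
```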
Verifying spans with DataPrime
source spans
| filter tags['gen_ai.provider.name']:string != null
| select $m.traceID,
tags['gen_ai.provider.name']:string,
tags['gen_ai.request.model']:string,
tags['gen_ai.usage.input_tokens']:string,
tags['gen_ai.usage.output_tokens']:string
| limit 10
Common attribute mistakes
| Mistake | Fix |
|---|---|
| gen_ai.input.messages set as object, not string | Must be a string containing JSON, not a native object |
| Missing role field in messages | Every message must have a role field |
| Using gen_ai.model instead of gen_ai.request.model | Correct: gen_ai.request.model (request) or gen_ai.response.model (response) |
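The first two rows of the table boil down to one habit: run messages through a JSON encoder before setting the attribute. A minimal illustration (the message shape and the commented span call are illustrative):

```python
import json

messages = [{"role": "user", "content": "What's the weather in Paris?"}]

# Wrong: passing the native list -- the attribute value must be a string.
# span.set_attribute("gen_ai.input.messages", messages)

# Right: JSON-encode first, so the value is a string and every
# message keeps its "role" field.
encoded = json.dumps(messages)
# span.set_attribute("gen_ai.input.messages", encoded)
assert isinstance(encoded, str)
```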
Further reading
- OpenTelemetry GenAI Semantic Conventions spec
- OTel Python Contrib GenAI instrumentors
- OpenLLMetry by Traceloop
- Microsoft.Extensions.AI documentation
Next steps
Start sending spans with Code examples — copy-pasteable Python, Java, .NET, and Go scripts for the most common providers.