# Span attribute inventory
Complete inventory of gen_ai.* and related span attributes consumed by Coralogix AI Center. Use this reference when instrumenting manually or when verifying that spans from your instrumentation library carry the attributes AI Center expects.
## How OTel GenAI semantic conventions work
Every LLM call generates a span representing the request-response lifecycle. The span carries attributes (key-value metadata) that describe what happened:
| Concept | Example |
|---|---|
| Provider | gen_ai.provider.name = "openai" |
| Model | gen_ai.request.model = "gpt-4o" |
| Input | gen_ai.input.messages = [{"role": "user", "parts": [{"type": "text", "content": "Hello"}]}] |
| Output | gen_ai.output.messages = [{"role": "assistant", "parts": [{"type": "text", "content": "Hi!"}]}] |
| Token usage | gen_ai.usage.input_tokens = 12, gen_ai.usage.output_tokens = 8 |
| Operation | gen_ai.operation.name = "chat" |
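Put together, the rows above form a plain attribute map that is attached to the span. A minimal sketch for a single chat call (all values illustrative; message arrays must be serialized as JSON strings):

```python
import json

# Illustrative attribute map for one chat-completion span.
# Message attributes are JSON-serialized arrays of {"role", "parts"} objects.
span_attributes = {
    "gen_ai.operation.name": "chat",
    "gen_ai.provider.name": "openai",
    "gen_ai.request.model": "gpt-4o",
    "gen_ai.usage.input_tokens": 12,
    "gen_ai.usage.output_tokens": 8,
    "gen_ai.input.messages": json.dumps(
        [{"role": "user", "parts": [{"type": "text", "content": "Hello"}]}]
    ),
    "gen_ai.output.messages": json.dumps(
        [{"role": "assistant", "parts": [{"type": "text", "content": "Hi!"}]}]
    ),
}
```

When instrumenting manually, each key-value pair in this map becomes one `span.set_attribute(key, value)` call on the active span.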
## Span attributes
| Attribute | Type | Description |
|---|---|---|
| gen_ai.provider.name | string | LLM provider name (for example, openai, anthropic) |
| gen_ai.request.model | string | Model name requested (for example, gpt-4o) |
| gen_ai.response.model | string | Model name in response (may differ from request) |
| gen_ai.request.temperature | number | Temperature parameter |
| gen_ai.usage.input_tokens | number | Input token count |
| gen_ai.usage.output_tokens | number | Output token count |
| gen_ai.input.messages | JSON array | All input messages as JSON parts array |
| gen_ai.output.messages | JSON array | All output messages as JSON parts array |
| gen_ai.response.finish_reasons | JSON array | Array of finish reasons |
| gen_ai.request.tools | JSON array | Tool definitions in request |
| gen_ai.tool.definitions | JSON array | Tool definitions (alternative key) |
| gen_ai.tool.name | string | Tool name (on tool-call child spans) |
| gen_ai.operation.name | string | Operation type ("chat", "text_completion", or "embeddings") |
| gen_ai.system_instructions | string | System instructions sent with the request |
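The table's Type column can be checked mechanically before export. The sketch below is hypothetical (`check_attrs` is not an AI Center or OTel API); it verifies scalar types and that JSON-array attributes actually parse to lists:

```python
import json

# Hypothetical pre-export check; illustrates the expected shapes only.
SCALAR_TYPES = {
    "gen_ai.provider.name": str,
    "gen_ai.request.model": str,
    "gen_ai.response.model": str,
    "gen_ai.request.temperature": (int, float),
    "gen_ai.usage.input_tokens": int,
    "gen_ai.usage.output_tokens": int,
    "gen_ai.operation.name": str,
    "gen_ai.system_instructions": str,
}
JSON_ARRAY_KEYS = {
    "gen_ai.input.messages",
    "gen_ai.output.messages",
    "gen_ai.response.finish_reasons",
    "gen_ai.request.tools",
    "gen_ai.tool.definitions",
}

def check_attrs(attrs):
    """Return the attribute keys whose values don't match the table."""
    bad = []
    for key, expected in SCALAR_TYPES.items():
        if key in attrs and not isinstance(attrs[key], expected):
            bad.append(key)
    for key in JSON_ARRAY_KEYS:
        if key not in attrs:
            continue
        try:
            parsed = json.loads(attrs[key])
        except (TypeError, ValueError):
            bad.append(key)
            continue
        if not isinstance(parsed, list):
            bad.append(key)
    return bad
```

A check like this is most useful in a span processor or exporter hook, where malformed attributes can be logged before they reach the backend.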
## User identity resolution
AI Center resolves user identity from multiple attribute keys, checked in the order below via firstNonNull():
| Attribute | Source |
|---|---|
| gen_ai.request.user | OTel semconv |
| enduser.id | OTel semconv |
| user.id | OTel semconv |
| traceloop.association.properties.user_id | OpenLLMetry |
| langsmith.metadata.user_id | LangSmith |
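The precedence above amounts to a first-non-null lookup. A minimal sketch (`resolve_user_id` is illustrative, not an AI Center function):

```python
# Candidate keys in the documented precedence order; first non-null wins.
USER_ID_KEYS = (
    "gen_ai.request.user",
    "enduser.id",
    "user.id",
    "traceloop.association.properties.user_id",
    "langsmith.metadata.user_id",
)

def resolve_user_id(attrs):
    """Return the first non-null user identity attribute, or None."""
    for key in USER_ID_KEYS:
        value = attrs.get(key)
        if value is not None:
            return value
    return None
```

Practically, this means that if your spans carry both enduser.id and a library-specific key such as langsmith.metadata.user_id, the OTel semconv key takes precedence.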
## Next steps
Look up which open-source library to use for your provider in Provider compatibility.