Strands Agents

Monitor applications built with the Strands Agents SDK using Coralogix AI Observability. The Strands integration enriches the built-in OpenTelemetry tracing provided by the SDK with GenAI semantic convention attributes, so you can track prompt content, completions, tool calls, and user identity across every agent run.

How Strands instrumentation works

The Strands Agents SDK ships with built-in OpenTelemetry tracing. The llm-tracekit Strands instrumentation enriches those existing spans with additional GenAI semantic convention attributes, covering prompt and completion content, tool call metadata, finish reasons, and end-user identity. You don't need to create spans manually.

What you need

  • Python 3.10–3.13.
  • Coralogix API keys.
  • strands-agents 0.1.0 or newer.

Installation

Install the package:

pip install "llm-tracekit[strands]"

Authentication

To export spans to Coralogix, configure OTLP credentials before enabling instrumentation. Use setup_export_to_coralogix or the corresponding environment variables.

Using setup_export_to_coralogix

from llm_tracekit.strands import setup_export_to_coralogix

setup_export_to_coralogix(
    service_name="ai-service",
    application_name="ai-application",
    subsystem_name="ai-subsystem",
    capture_content=True,
)

Manual OTel setup

Configure a tracer provider directly using the standard OpenTelemetry SDK:

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import SERVICE_NAME, Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

tracer_provider = TracerProvider(
    resource=Resource.create({SERVICE_NAME: "ai-service"}),
)
exporter = OTLPSpanExporter()
span_processor = SimpleSpanProcessor(exporter)
tracer_provider.add_span_processor(span_processor)
trace.set_tracer_provider(tracer_provider)

Using environment variables

If arguments are not passed to setup_export_to_coralogix, the helper reads the following environment variables:

  • CX_TOKEN: Your Coralogix API key.
  • CX_ENDPOINT: The ingress endpoint that corresponds to your Coralogix domain, in the form ingress.<domain>:443 (for example, ingress.eu2.coralogix.com:443).
  • CX_APPLICATION_NAME: Your application's name.
  • CX_SUBSYSTEM_NAME: Your subsystem's name.
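As an alternative to exporting these variables in your shell, they can be set from Python before setup_export_to_coralogix is called. The values below are placeholders, and the eu2 endpoint is only an example for an EU cluster; substitute your own key and domain:

```python
import os

# Placeholder values; substitute your own API key and regional endpoint.
# setdefault keeps any values already exported in the environment.
os.environ.setdefault("CX_TOKEN", "<your-coralogix-api-key>")
os.environ.setdefault("CX_ENDPOINT", "ingress.eu2.coralogix.com:443")
os.environ.setdefault("CX_APPLICATION_NAME", "ai-application")
os.environ.setdefault("CX_SUBSYSTEM_NAME", "ai-subsystem")
```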

Set up tracing

Instrument

Create an instance of StrandsInstrumentor and call its instrument method before running your agent.

from llm_tracekit.strands import StrandsInstrumentor

StrandsInstrumentor().instrument()

Uninstrument

To remove instrumentation, call the uninstrument method.

StrandsInstrumentor().uninstrument()

Full example

from llm_tracekit.strands import StrandsInstrumentor, setup_export_to_coralogix
from strands import Agent

# Optional: configure sending spans to Coralogix
# Reads connection details from environment variables: CX_TOKEN, CX_ENDPOINT
setup_export_to_coralogix(
    service_name="ai-service",
    application_name="ai-application",
    subsystem_name="ai-subsystem",
    capture_content=True,
)

# Activate instrumentation
StrandsInstrumentor().instrument()

# Example Strands usage
agent = Agent(system_prompt="You are a helpful assistant.")
response = agent("Write a short poem on open telemetry.")

Enable message content capture

By default, message content—such as prompts, completions, tool call arguments, and tool responses—is not captured.

To capture message content as span attributes, do one of the following:

  • Pass capture_content=True when calling setup_export_to_coralogix.
  • Set the environment variable OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT to true.

Coralogix recommends enabling message content capture. Many AI evaluations will not work without it.
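If you prefer the environment-variable route, the flag can also be set from Python. A minimal sketch; the variable must be set before instrumentation starts producing spans:

```python
import os

# Equivalent to passing capture_content=True to setup_export_to_coralogix.
# Set this before StrandsInstrumentor().instrument() runs.
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"
```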

User identification

To associate traces with a specific end user, pass the user parameter in your model's params configuration:

from strands.models.openai import OpenAIModel
from strands import Agent

model = OpenAIModel(
    model_id="gpt-4o",
    params={"user": "[email protected]"}
)

agent = Agent(model=model)

This sets the gen_ai.request.user span attribute on every request made by that agent.

Validate the integration

After running an instrumented agent, open AI Center in Coralogix. Agent runs appear as spans in the LLM Calls view. Confirm that:

  • Spans are associated with the correct application and subsystem names.
  • Prompt and completion content appears if message content capture is enabled.
  • Tool calls, finish reasons, and user identifiers populate as expected.

Semantic conventions

| Attribute | Type | Description | Example |
|---|---|---|---|
| gen_ai.prompt.<message_number>.role | string | Role of the author for each input message | system, user, assistant, tool |
| gen_ai.prompt.<message_number>.content | string | Contents of the input message (captured when content capture is enabled) | What's the weather in Paris? |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.id | string | ID of a tool call issued from the prompt | call_O8NOz8VlxosSASEsOY7LDUcP |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.type | string | Type of tool call issued from the prompt | function |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.function.name | string | Function name used in the prompt's tool call | get_current_weather |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.function.arguments | string | Arguments passed to the prompt's tool call | {"location": "Seattle, WA"} |
| gen_ai.prompt.<message_number>.tool_call_id | string | ID of the tool call result returned in a tool message | call_mszuSIzqtI65i1wAUOE8w5H4 |
| gen_ai.completion.<choice_number>.role | string | Role of the author for each returned choice | assistant |
| gen_ai.completion.<choice_number>.finish_reason | string | Finish reason reported for the choice | stop, tool_calls, error |
| gen_ai.completion.<choice_number>.content | string | Text returned by the model (captured when content capture is enabled) | The weather in Paris is rainy and overcast... |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.id | string | ID of a tool call triggered by the model | call_O8NOz8VlxosSASEsOY7LDUcP |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.type | string | Type of tool call triggered by the model | function |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.function.name | string | Function name executed by the model's tool call | get_current_weather |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.function.arguments | string | Arguments supplied to the model's tool call | {"location": "Seattle, WA"} |
| gen_ai.request.tools.<tool_number>.type | string | Declared type for each available tool definition exposed to the model | function |
| gen_ai.request.tools.<tool_number>.function.name | string | Tool name surfaced in the request-level invocation params | get_current_weather |
| gen_ai.request.tools.<tool_number>.function.description | string | Tool description shown to the model | Get the current weather in a given location |
| gen_ai.request.tools.<tool_number>.function.parameters | string | JSON-serialized parameter schema | {"type":"object","properties":{"location":{"type":"string"}},"required":["location"]} |
| gen_ai.request.user | string | A unique identifier representing the end-user | [email protected] |
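To make the indexed naming scheme concrete, the sketch below flattens a hypothetical two-message conversation into gen_ai.prompt.<message_number>.* attribute keys. The flatten_prompt helper is illustrative only, not part of llm-tracekit:

```python
def flatten_prompt(messages):
    """Flatten a list of chat messages into indexed GenAI span attribute keys."""
    attributes = {}
    for i, message in enumerate(messages):
        # Each message gets its own index in the attribute name.
        attributes[f"gen_ai.prompt.{i}.role"] = message["role"]
        if "content" in message:
            attributes[f"gen_ai.prompt.{i}.content"] = message["content"]
    return attributes

attrs = flatten_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather in Paris?"},
])
# attrs["gen_ai.prompt.1.role"] == "user"
```

Completion choices and tool calls follow the same pattern, with gen_ai.completion.<choice_number> and nested tool_calls.<tool_call_number> indices.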

Next steps

  • AI Center Overview — monitor performance, costs, quality issues, and security across all AI applications
  • Evaluate — set up evaluation policies to assess prompt and response quality
  • LangChain integration — add LangChain tracing alongside Strands