LangChain

Coralogix's AI Observability integrations make it easy to monitor any LangChain-powered application. With a dedicated LangChain integration, Coralogix consolidates spans emitted by OpenAI, Anthropic, AWS Bedrock, and other LangChain chat providers so teams can understand performance, drift, and tool usage without stitching logs across services.

How the LangChain integration works

This library ships production-ready OpenTelemetry instrumentation for LangChain. It automatically hooks into LangChain's callback manager, traces chat and tool-call workflows, and records usage metrics. The integration accelerates debugging, highlights token consumption, and standardizes observability across multi-provider LangChain deployments.

Supported providers

The following providers are supported with full prompt/completion attributes:
| Provider | Chat model class | System value |
|---|---|---|
| OpenAI | ChatOpenAI | openai |
| Anthropic | ChatAnthropic | anthropic |
| AWS Bedrock | ChatBedrock, ChatBedrockConverse, BedrockChat | aws.bedrock |

Other chat model classes are still instrumented, with the system value langchain. A span is always created; when the model name is not available in standard metadata keys, the integration falls back to other metadata or the provider class name.
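The fallback behavior above can be sketched roughly as a lookup with a default (purely illustrative; the mapping and function names here are assumptions, not llm-tracekit internals):

```python
# Illustrative sketch of the provider-resolution fallback described above.
# KNOWN_SYSTEMS and both function names are hypothetical, not library API.
KNOWN_SYSTEMS = {
    "ChatOpenAI": "openai",
    "ChatAnthropic": "anthropic",
    "ChatBedrock": "aws.bedrock",
    "ChatBedrockConverse": "aws.bedrock",
    "BedrockChat": "aws.bedrock",
}

def resolve_system(chat_class_name: str) -> str:
    """Map a LangChain chat model class to a gen_ai.system value."""
    return KNOWN_SYSTEMS.get(chat_class_name, "langchain")

def resolve_model_name(metadata: dict, chat_class_name: str) -> str:
    """Prefer a standard metadata key; fall back to the class name."""
    return metadata.get("model_name") or chat_class_name
```

For example, an unrecognized chat class still gets a span, just tagged with the generic langchain system value.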

What you need

  • Python 3.10 or later.
  • A Coralogix API key.
  • LangChain 1.0.0 or newer along with the provider-specific SDKs (for example langchain-openai or langchain-aws).

Ensure each underlying provider SDK (OpenAI, Anthropic, AWS Bedrock, etc.) is configured with valid credentials before enabling instrumentation.
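Provider SDKs typically read credentials from environment variables; a minimal sketch (the values are placeholders, and the exact variable names depend on each SDK):

```python
import os

# Placeholder credentials; substitute your real provider keys.
# OPENAI_API_KEY is read by the OpenAI SDK and ANTHROPIC_API_KEY by the
# Anthropic SDK; AWS Bedrock uses standard AWS credential configuration.
os.environ["OPENAI_API_KEY"] = "sk-placeholder"
os.environ["ANTHROPIC_API_KEY"] = "placeholder"
```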

Installation

Run the following command.

pip install "llm-tracekit-langchain"

Authentication

Exporting spans to Coralogix requires configuring OTLP credentials before enabling instrumentation. Use setup_export_to_coralogix or the corresponding environment variables to supply authentication details.

Using setup_export_to_coralogix

from llm_tracekit.langchain import setup_export_to_coralogix

setup_export_to_coralogix(
    service_name="ai-service",
    application_name="ai-application",
    subsystem_name="ai-subsystem",
    capture_content=True,
)

Using environment variables

If arguments are not passed to setup_export_to_coralogix, the helper reads the following environment variables:

  • CX_TOKEN: Your Coralogix API key.
  • CX_ENDPOINT: The ingress endpoint that corresponds to your Coralogix domain, in the form ingress.<domain>:443.
  • CX_APPLICATION_NAME: Your application's name.
  • CX_SUBSYSTEM_NAME: Your subsystem's name.
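For instance, the environment-variable path can be configured in code before the helper is called (values here are placeholders; CX_ENDPOINT must match your Coralogix domain):

```python
import os

# Placeholder values; replace with your actual Coralogix details.
os.environ["CX_TOKEN"] = "your-coralogix-api-key"
os.environ["CX_ENDPOINT"] = "ingress.<your-coralogix-domain>:443"
os.environ["CX_APPLICATION_NAME"] = "ai-application"
os.environ["CX_SUBSYSTEM_NAME"] = "ai-subsystem"

# With the variables above set, the helper reads them instead of arguments:
# from llm_tracekit.langchain import setup_export_to_coralogix
# setup_export_to_coralogix(service_name="ai-service")
```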

Set up tracing

Instrument

LangChain instrumentation is installed globally by wrapping BaseCallbackManager. Create an instance of LangChainInstrumentor and call instrument before instantiating your chains.

from llm_tracekit.langchain import LangChainInstrumentor

LangChainInstrumentor().instrument()

Uninstrument

To remove instrumentation, call the uninstrument method.

LangChainInstrumentor().uninstrument()

Full example

from llm_tracekit.langchain import LangChainInstrumentor, setup_export_to_coralogix
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

# Optional: configure sending spans to Coralogix
# Reads connection details from environment variables: CX_TOKEN, CX_ENDPOINT
setup_export_to_coralogix(
    service_name="ai-service",
    application_name="ai-application",
    subsystem_name="ai-subsystem",
    capture_content=True,
)

# Activate instrumentation
LangChainInstrumentor().instrument()

# Example LangChain usage
llm = ChatOpenAI(model="gpt-4o-mini")
messages = [HumanMessage(content="Write a short poem on open telemetry.")]
response = llm.invoke(messages)

# Pass user via config metadata
response = llm.invoke(
    messages,
    config={"metadata": {"user": "[email protected]"}}
)

Enable message content capture

By default, message content — such as prompts, completions, tool call arguments, and tool responses — is not captured.

To capture message content as span attributes, do one of the following:

  • Pass capture_content=True when calling setup_export_to_coralogix.
  • Set the environment variable OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT to true.

Many Coralogix AI evaluations rely on message content; enabling capture is a best practice.
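The environment-variable option can also be applied in code, as long as it happens before instrumentation is activated:

```python
import os

# Enable capturing prompts, completions, and tool-call payloads as span
# attributes. Must be set before LangChainInstrumentor().instrument() runs.
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"
```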

Key differences from OpenTelemetry

Prompts, tool invocations, and model responses are stored as span attributes instead of log events, preserving a single correlated timeline for each LangChain run.

Semantic conventions

| Attribute | Type | Description | Example |
|---|---|---|---|
| gen_ai.operation.name | string | The specific name of the LangChain operation being performed | chat |
| gen_ai.system | string | The provider or framework responsible for the operation | openai / aws.bedrock / anthropic |
| gen_ai.request.model | string | The name of the model requested by the user or application | gpt-4o-mini |
| gen_ai.request.temperature | float | The temperature parameter passed in the request | 0.2 |
| gen_ai.request.top_p | float | The top_p parameter used for nucleus sampling | 0.95 |
| gen_ai.request.user | string | A unique identifier representing the end-user (from config={"metadata": {"user": "..."}}) | [email protected] |
| gen_ai.prompt.<message_number>.role | string | Role of the author for each input message | user |
| gen_ai.prompt.<message_number>.content | string | Contents of the input message (captured when content capture is enabled) | Draft a release note for LangChain instrumentation. |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.id | string | ID of a tool call issued from the prompt | call_yPIxaozNPCSp1tJ34Hsbdtzg |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.type | string | Type of tool call issued from the prompt | function |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.function.name | string | Function name used in the prompt's tool call | get_current_weather |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.function.arguments | string | Arguments passed to the prompt's tool call | {"location": "Paris"} |
| gen_ai.request.tools.<tool_number>.type | string | Declared type for each available tool definition exposed to the model | function |
| gen_ai.request.tools.<tool_number>.function.name | string | Tool name surfaced in the request-level invocation params | get_destination_tip |
| gen_ai.request.tools.<tool_number>.function.description | string | Tool description shown to the LLM | Return a mock travel tip for the provided city. |
| gen_ai.request.tools.<tool_number>.function.parameters | string | JSON-serialized parameter schema | {"type":"object","properties":{"city":{"type":"string"}}} |
| gen_ai.completion.<choice_number>.role | string | Role of the author for each returned choice | assistant |
| gen_ai.completion.<choice_number>.finish_reason | string | Finish reason reported for the choice | stop |
| gen_ai.completion.<choice_number>.content | string | Text returned by the model (captured when content capture is enabled) | Here is the requested release note... |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.id | string | ID of a tool call triggered by the model | call_O8NOz8VlxosSASEsOY7LDUcP |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.type | string | Type of tool call triggered by the model | function |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.function.name | string | Function name executed by the model's tool call | get_current_weather |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.function.arguments | string | Arguments supplied to the model's tool call | {"location": "Paris"} |
| gen_ai.response.model | string | Exact model identifier that produced the response | gpt-4o-mini |
| gen_ai.usage.input_tokens | int | Number of tokens consumed by the prompt | 744 |
| gen_ai.usage.output_tokens | int | Number of tokens generated in the response | 256 |
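To make the indexed naming scheme concrete, here is a small helper (purely illustrative, not part of llm-tracekit) that flattens a list of chat messages into gen_ai.prompt.<message_number>.* attribute keys like those in the table above:

```python
def flatten_prompt_attributes(messages):
    """Flatten chat messages into indexed gen_ai.prompt.* span attributes."""
    attributes = {}
    for i, message in enumerate(messages):
        attributes[f"gen_ai.prompt.{i}.role"] = message["role"]
        if "content" in message:
            attributes[f"gen_ai.prompt.{i}.content"] = message["content"]
        for j, call in enumerate(message.get("tool_calls", [])):
            prefix = f"gen_ai.prompt.{i}.tool_calls.{j}"
            attributes[f"{prefix}.id"] = call["id"]
            attributes[f"{prefix}.type"] = call["type"]
            attributes[f"{prefix}.function.name"] = call["function"]["name"]
            attributes[f"{prefix}.function.arguments"] = call["function"]["arguments"]
    return attributes
```

For example, flatten_prompt_attributes([{"role": "user", "content": "Hi"}]) produces {"gen_ai.prompt.0.role": "user", "gen_ai.prompt.0.content": "Hi"}; completion and tool-definition attributes follow the same indexed pattern.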

LangChain-specific attributes

| Attribute | Type | Description | Example |
|---|---|---|---|
| gen_ai.provider.name | string | The provider name from LangChain metadata | openai, anthropic |