Integrating AI Projects into Coralogix

Overview

The Coralogix AI Observability SDK for Python provides observability for your Python-based AI applications. This library is a modified version of the OpenTelemetry instrumentation for OpenAI, designed for integration with Coralogix AI Center.

Requirements

  • Python 3.8 or above.
  • Coralogix API keys.

Installation

Run the following command:

pip install llm-tracekit

Authentication

Authentication details are passed when defining the OTel span exporter:

  1. Select the endpoint associated with your Coralogix domain.
  2. Use your Coralogix API key in the authorization request header.
  3. Provide the application and subsystem names.

from llm_tracekit import setup_export_to_coralogix

setup_export_to_coralogix(
    coralogix_token="<your_coralogix_token>",
    coralogix_endpoint="<your_coralogix_endpoint>",
    service_name="ai-service",
    application_name="ai-application",
    subsystem_name="ai-subsystem",
    capture_content=True,
)

Note

All of the authentication parameters can also be provided through environment variables (CX_TOKEN, CX_ENDPOINT, etc.).
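For example, if CX_TOKEN and CX_ENDPOINT are already set in the process environment, the explicit credential arguments can be dropped. This is a minimal sketch of that pattern, which is the same one used in the full example further below:

from llm_tracekit import setup_export_to_coralogix

# Connection details are read from the CX_TOKEN and CX_ENDPOINT
# environment variables, so no credentials need to appear in code.
setup_export_to_coralogix(
    service_name="ai-service",
    application_name="ai-application",
    subsystem_name="ai-subsystem",
    capture_content=True,
)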

Usage

This library serves as a replacement for the OpenTelemetry OpenAI instrumentation.

Manually set up the OpenTelemetry OpenAI instrumentation, replacing opentelemetry.instrumentation.openai_v2 with llm_tracekit, as sketched below.
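For code that already uses the upstream instrumentor, typically only the import changes. A sketch, assuming the upstream opentelemetry-instrumentation-openai-v2 package was used before:

# Before (upstream OpenTelemetry instrumentation):
# from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

# After (Coralogix AI Observability SDK):
from llm_tracekit import OpenAIInstrumentor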

Activation

from llm_tracekit import OpenAIInstrumentor

OpenAIInstrumentor().instrument()

Full example

from llm_tracekit import OpenAIInstrumentor, setup_export_to_coralogix
from openai import OpenAI

# Optional: Configure sending spans to Coralogix
# Reads Coralogix connection details from the following environment variables:
# - CX_TOKEN
# - CX_ENDPOINT
setup_export_to_coralogix(
    service_name="ai-service",
    application_name="ai-application",
    subsystem_name="ai-subsystem",
    capture_content=True,
)

# Activate instrumentation
OpenAIInstrumentor().instrument()

# Example OpenAI Usage
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "user", "content": "Write a short poem on open telemetry."},
    ],
)

Enabling message content capture

By default, message content, such as prompt and completion contents, function call arguments, and return values, is not captured. To capture message content as span attributes, set the environment variable OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT to true.

Most Coralogix AI evaluations require message contents to function properly, so enabling message capture is strongly recommended.
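As a minimal sketch, the variable can be set in the process environment before instrumentation is activated; exporting it in the shell before starting the application works just as well:

import os

from llm_tracekit import OpenAIInstrumentor

# Capture prompt, completion, and tool call contents as span attributes.
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"

OpenAIInstrumentor().instrument()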

Key differences from OpenTelemetry

  • The user parameter in the OpenAI Chat Completions API is captured in the span as the gen_ai.openai.request.user attribute.
  • The tools parameter in the OpenAI Chat Completions API is captured in the span as the gen_ai.openai.request.tools attribute.
  • User prompts and model responses are captured as span attributes instead of log events, as detailed below.

Semantic conventions

| Attribute | Type | Description | Example |
|---|---|---|---|
| gen_ai.openai.request.user | string | A unique identifier representing the end user | [email protected] |
| gen_ai.prompt.<message_number>.role | string | Role of the message author for a user message | system, user, assistant, tool |
| gen_ai.prompt.<message_number>.content | string | Contents of the user message | What's the weather in Paris? |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.id | string | ID of the tool call in the user message | call_O8NOz8VlxosSASEsOY7LDUcP |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.type | string | Type of the tool call in the user message | function |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.function.name | string | Name of the function used in the tool call within the user message | get_current_weather |
| gen_ai.prompt.<message_number>.tool_calls.<tool_call_number>.function.arguments | string | Arguments passed to the function used in the tool call within the user message | {"location": "Seattle, WA"} |
| gen_ai.prompt.<message_number>.tool_call_id | string | Tool call ID in the user message | call_mszuSIzqtI65i1wAUOE8w5H4 |
| gen_ai.completion.<choice_number>.role | string | Role of the message author for a choice in the model response | assistant |
| gen_ai.completion.<choice_number>.finish_reason | string | Finish reason for a choice in the model response | stop, tool_calls, error |
| gen_ai.completion.<choice_number>.content | string | Contents of a choice in the model response | The weather in Paris is rainy and overcast, with temperatures around 57°F |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.id | string | ID of the tool call in a choice | call_O8NOz8VlxosSASEsOY7LDUcP |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.type | string | Type of the tool call in a choice | function |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.function.name | string | Name of the function used in the tool call within a choice | get_current_weather |
| gen_ai.completion.<choice_number>.tool_calls.<tool_call_number>.function.arguments | string | Arguments passed to the function used in the tool call within a choice | {"location": "Seattle, WA"} |
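For illustration only, a span for a request with one prompt message whose first choice returns a single tool call might carry attributes along these lines (hypothetical values, assuming zero-based message and tool call numbering):

# Hypothetical span attributes for one prompt message and one completion choice
# that returns a single tool call (illustrative values only).
example_attributes = {
    "gen_ai.openai.request.user": "user-1234",
    "gen_ai.prompt.0.role": "user",
    "gen_ai.prompt.0.content": "What's the weather in Paris?",
    "gen_ai.completion.0.role": "assistant",
    "gen_ai.completion.0.finish_reason": "tool_calls",
    "gen_ai.completion.0.tool_calls.0.id": "call_O8NOz8VlxosSASEsOY7LDUcP",
    "gen_ai.completion.0.tool_calls.0.type": "function",
    "gen_ai.completion.0.tool_calls.0.function.name": "get_current_weather",
    "gen_ai.completion.0.tool_calls.0.function.arguments": '{"location": "Paris, France"}',
}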