Getting started with AI observability

This guide walks you through integrating Coralogix's AI Observability solution with the OpenAI platform to monitor and gain insights into your LLM applications. By following these steps, you can start sending AI observability data to Coralogix AI Center in a few minutes.

Using an external coding tool like Claude Code or Codex CLI?

This guide is for apps you build and instrument with the llm-tracekit SDK. External developer tools use a direct OTLP integration and appear in a separate screen; see Code agents observability for setup instructions.

Note

Check out all our instrumentations in Integrations for LLM observability.

What you need

AI Center processes only trace data (not logs) and retrieves it exclusively from your S3 archive; data stored only in Frequent Search is ignored. Instrument your observability data as traces and route it to archive storage.

Install the SDK

pip install "llm-tracekit[openai]"

Set up environment variables

# Coralogix credentials
export CX_TOKEN="your-coralogix-api-key"
export CX_ENDPOINT="your-coralogix-region-endpoint"

# OpenAI API key
export OPENAI_API_KEY="your-openai-api-key"
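Before running the application, it can help to sanity-check that all three variables are set. The helper below is a minimal sketch; the variable names follow the export commands above, and the function name is illustrative, not part of the SDK:

```python
import os

# The three variables the export commands above define.
REQUIRED_VARS = ["CX_TOKEN", "CX_ENDPOINT", "OPENAI_API_KEY"]

def missing_env_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

if __name__ == "__main__":
    missing = missing_env_vars()
    if missing:
        print(f"Missing environment variables: {', '.join(missing)}")
    else:
        print("All required environment variables are set.")
```

Run it once before the demo; an empty result means you are ready to proceed.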

Create a simple application

Create a new Python file (for example, ai_center_demo.py) using the following code:

import os
from openai import OpenAI
from llm_tracekit import OpenAIInstrumentor, setup_export_to_coralogix

# Configure export to Coralogix
setup_export_to_coralogix(
    service_name="ai-demo-service",
    application_name="ai-demo-app",
    subsystem_name="getting-started"
)

# Instrument OpenAI client
OpenAIInstrumentor().instrument()

# Initialize OpenAI client
client = OpenAI()

def generate_content():
    print("Sending request to OpenAI...")
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Explain what AI observability is in one sentence."},
        ],
    )

    print("\n" + "="*50)
    print("📝 AI RESPONSE:")
    print(f"{response.choices[0].message.content}")
    print("="*50)
    print("\n✅ Traces have been successfully sent to Coralogix AI Center!")
    print("View your data in the Coralogix AI Center dashboard.\n")

if __name__ == "__main__":
    generate_content()

Run the application

python ai_center_demo.py

Expected output:

Sending request to OpenAI...

==================================================
πŸ“ AI RESPONSE:
AI observability refers to the tools and practices used to monitor, analyze, and understand the behavior and performance of AI models and systems in real-time, ensuring they operate effectively and align with intended outcomes.
==================================================

✅ Traces have been successfully sent to Coralogix AI Center!
View your data in the Coralogix AI Center dashboard.

View your data in Coralogix AI Center

  1. Log into your Coralogix account.
  2. Go to AI Center, then Application Catalog to see your new service.
  3. Select your application to view its detailed information.
  4. Navigate to the AI Explorer section to see the trace for your request.

Capture tool calls

If your application uses OpenAI's function calling capabilities, these are automatically captured as part of the trace data.
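As a sketch of what gets captured, here is the shape of a tool definition you would pass to client.chat.completions.create via its tools parameter. The get_weather function and its schema are illustrative examples, not part of the SDK:

```python
import json

# Illustrative tool definition in the OpenAI function-calling format;
# pass a list like [weather_tool] as `tools=` to client.chat.completions.create.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# When the model decides to call the tool, the response carries the call's
# arguments as a JSON string (response.choices[0].message.tool_calls[0]
# .function.arguments), which you parse before executing the tool:
example_arguments = json.loads('{"city": "Paris"}')
print(example_arguments["city"])
```

With OpenAIInstrumentor active, the tool definitions and any resulting tool calls are recorded as part of the same trace, so no extra instrumentation is needed for this case.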

Troubleshoot

Application Catalog and AI Explorer are empty after instrumentation. Cause: AI Center processes only spans produced by the Coralogix llm-tracekit SDK. Spans from other instrumentation libraries (even when they include gen_ai.system) are ignored. Fix:

  1. Open Spans Explorer and filter by your application.
  2. Confirm the instrumentation library is llm-tracekit. See Integrations for LLM observability for the supported list.
  3. If your application uses a different OpenTelemetry library, switch to the matching llm-tracekit integration.

Spans are visible in Spans Explorer but not in AI Center. Cause: AI Center retrieves data exclusively from your S3 archive. Data stored only in Frequent Search is ignored. Fix: Configure your S3 archive and confirm AI Center spans are routed to archive storage.

An AI application appears in the service map but not in the Application Catalog. Cause: The service map lists every service that emits any telemetry. The Application Catalog lists only applications instrumented with the llm-tracekit SDK. Fix: Verify the spans come from llm-tracekit, as described in the first item of this section. Once they do, the application appears in the Application Catalog within minutes.

Next steps

Connect additional LLM providers and frameworks with Integrations for LLM observability.