Getting Started with AI Observability
This guide walks you through integrating Coralogix's AI Observability solution with the OpenAI platform to monitor and gain insights into your LLM applications. By following these steps, you can start sending AI observability data to Coralogix AI Center in a few minutes.
Note
For instructions on instrumenting interactions with Bedrock-hosted models, see Amazon Bedrock.
What you need
- Python 3.8 or higher.
- A Coralogix account with a Send-Your-Data API key.
- An OpenAI API key.
AI Center processes only trace data, not logs, and retrieves it exclusively from your S3 archive. Data stored in Frequent Search will be ignored. Instrument your observability data as traces and route it to archive storage.
Install the SDK
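Install the instrumentation SDK with pip. This assumes the package is published on PyPI under the same name as its import, llm-tracekit:

```shell
pip install llm-tracekit
```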
Set up environment variables
# Coralogix credentials
export CX_TOKEN="your-coralogix-api-key"
export CX_ENDPOINT="your-coralogix-region-endpoint"
# OpenAI API key
export OPENAI_API_KEY="your-openai-api-key"
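Before running the demo, you can confirm that all three variables are visible to Python. This is a quick optional check, not part of the SDK:

```python
import os

def missing_credentials(required=("CX_TOKEN", "CX_ENDPOINT", "OPENAI_API_KEY")):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

missing = missing_credentials()
if missing:
    print("Missing environment variables:", ", ".join(missing))
else:
    print("All credentials are set.")
```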
Create a simple application
Create a new Python file (for example, ai_center_demo.py) using the following code:
import os
from openai import OpenAI
from llm_tracekit import OpenAIInstrumentor, setup_export_to_coralogix

# Configure export to Coralogix
setup_export_to_coralogix(
    service_name="ai-demo-service",
    application_name="ai-demo-app",
    subsystem_name="getting-started",
)

# Instrument OpenAI client
OpenAIInstrumentor().instrument()

# Initialize OpenAI client
client = OpenAI()

def generate_content():
    print("Sending request to OpenAI...")
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Explain what AI observability is in one sentence."},
        ],
    )
    print("\n" + "=" * 50)
    print("📝 AI RESPONSE:")
    print(f"{response.choices[0].message.content}")
    print("=" * 50)
    print("\n✅ Traces have been successfully sent to Coralogix AI Center!")
    print("View your data in the Coralogix AI Center dashboard.\n")

if __name__ == "__main__":
    generate_content()
Run the application
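Run the script from the same shell in which you exported the credentials:

```shell
python ai_center_demo.py
```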
Expected output:
Sending request to OpenAI...
==================================================
📝 AI RESPONSE:
AI observability refers to the tools and practices used to monitor, analyze, and understand the behavior and performance of AI models and systems in real-time, ensuring they operate effectively and align with intended outcomes.
==================================================
✅ Traces have been successfully sent to Coralogix AI Center!
View your data in the Coralogix AI Center dashboard.
View your data in Coralogix AI Center
- Log in to your Coralogix account.
- Go to AI Center, then Application Catalog to see your new service.
- Select your application to view its detailed information.
- Navigate to the AI Explorer section to see the trace for your request.
Capture tool calls
If your application uses OpenAI's function calling capabilities, these are automatically captured as part of the trace data.
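For example, a tool definition in the standard OpenAI function-calling format is recorded on the trace as soon as it is passed to the instrumented client; no extra instrumentation code is needed. The `get_weather` tool below is a hypothetical example, not part of the SDK:

```python
# A hypothetical weather-lookup tool in the OpenAI function-calling schema.
# Passing this list as tools=weather_tools to client.chat.completions.create()
# causes the instrumented client to capture the tool definitions, and any tool
# calls the model makes, as part of the trace automatically.
weather_tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]
```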
Next steps
- Integrations — Connect LLM providers and frameworks including OpenAI, LangChain, Bedrock, and more.
- Getting Started with Guardrails — Add real-time policy enforcement to your application.