Integrations for LLM observability
Coralogix AI Observability integrations give you deep insight into your AI applications, letting you monitor, analyze, and optimize performance across LLM providers, frameworks, and code agents. Get end-to-end visibility into AI workloads with proactive issue detection and efficient performance tuning.
LLM providers and frameworks
- Amazon Bedrock
- Anthropic
- Gemini
- Google ADK
- LangChain
- LangGraph
- LiteLLM
- Mastra
- Microsoft Foundry
- OpenAI
- OpenAI Agents SDK
- Strands Agents
Code agents
- Claude: hub for Claude Code and Claude Cowork
- Claude Code
- Claude Cowork
- Codex CLI
- Cursor
- Gemini CLI
- OpenClaw
AI discovery
Next steps
To start monitoring your AI applications, see Monitor AI applications.