Gemini CLI integration with Coralogix

Stream token usage, tool calls, API requests, model routing decisions, and session traces from every Gemini CLI session directly into Coralogix. Gemini CLI emits logs, metrics, and traces over OpenTelemetry (OTel) natively—point it at your Coralogix ingress endpoint and data flows automatically.

Supported environments

  • OS: macOS, Linux (*nix)
  • Shell: bash, zsh

What you need

  • A Coralogix account with a Send-Your-Data API key. In Coralogix, navigate to Settings, then API Keys.
  • Your Coralogix OTLP endpoint: ingress.<domain>:443, where <domain> is the Coralogix domain for your region. Use the domain selector at the top of this page to select your region.
  • Gemini CLI installed on your machine (npm install -g @google/gemini-cli).

Set up

Install

  1. Install Gemini CLI:

    npm install -g @google/gemini-cli
    
  2. Authenticate with Google:

    gemini
    

    Follow the browser OAuth flow when prompted, or set an API key:

    export GEMINI_API_KEY="your-key-from-aistudio.google.com"
    
  3. Clone the Coralogix AI agent instrumentation repository and navigate to the Gemini CLI directory:

    git clone https://github.com/coralogix/ai-agent-instrumentation.git
    cd ai-agent-instrumentation/gemini-cli
    

Configure

  1. Copy the example environment file:

    cp .env.example .env
    
  2. Open .env and set the following values:

    • CX_API_KEY — your Send-Your-Data API key
    • CX_OTLP_ENDPOINT — your OTLP endpoint (ingress.<domain>:443 for your region)
  3. Activate the instrumentation and start Gemini CLI:

    source activate.sh
    gemini
    

    Gemini CLI sessions now stream telemetry to Coralogix.
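Before relying on the stream, it can help to confirm the activation script actually exported its variables into the current shell. A minimal check, assuming the variable names activate.sh sets as described in this guide:

```shell
# Print each telemetry variable, or <unset> if activate.sh did not export it.
for v in GEMINI_TELEMETRY_ENABLED GEMINI_TELEMETRY_TARGET \
         GEMINI_TELEMETRY_OTLP_ENDPOINT OTEL_EXPORTER_OTLP_HEADERS; do
  printf '%s=%s\n' "$v" "$(printenv "$v" || echo '<unset>')"
done
```

Any `<unset>` entry means the script was not sourced in this terminal.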

Make it permanent

To activate instrumentation automatically in every new terminal session, add the following line to your shell profile (~/.zshrc or ~/.bashrc):

source /path/to/gemini-cli/activate.sh
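If you prefer not to edit the file by hand, the line can be appended idempotently; this is a sketch using the placeholder path from above (substitute your actual clone location, and ~/.bashrc if you use bash):

```shell
# Append the activation line to ~/.zshrc only if an identical line is not
# already present, so repeated runs do not duplicate it.
LINE='source /path/to/gemini-cli/activate.sh'
grep -qxF -- "$LINE" "$HOME/.zshrc" 2>/dev/null || printf '%s\n' "$LINE" >> "$HOME/.zshrc"
```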

Advanced: set environment variables directly

If you prefer to set the environment variables without the activation script, add the following to ~/.zshrc or ~/.bashrc:

if [ -f "$HOME/path/to/gemini-cli/.env" ]; then
  set -a; source "$HOME/path/to/gemini-cli/.env"; set +a
fi
export GEMINI_TELEMETRY_ENABLED=true
export GEMINI_TELEMETRY_TARGET=local
export GEMINI_TELEMETRY_OTLP_PROTOCOL=grpc
export GEMINI_TELEMETRY_OTLP_ENDPOINT="${CX_OTLP_ENDPOINT}"
export OTEL_EXPORTER_OTLP_HEADERS="authorization=Bearer ${CX_API_KEY},cx-application-name=${CX_APPLICATION_NAME},cx-subsystem-name=${CX_SUBSYSTEM_NAME}"
export OTEL_RESOURCE_ATTRIBUTES="cx.application.name=${CX_APPLICATION_NAME},cx.subsystem.name=${CX_SUBSYSTEM_NAME}"
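To double-check the assembled header string without pasting your API key into a terminal log, you can mask the bearer token before printing it. A small sketch; the sed pattern assumes the header layout shown above:

```shell
# Print OTEL_EXPORTER_OTLP_HEADERS with the bearer token replaced by ****.
printf '%s\n' "${OTEL_EXPORTER_OTLP_HEADERS:-<unset>}" \
  | sed -E 's/(Bearer )[^,]+/\1****/'
```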

Alternative: settings.json

Instead of environment variables, configure the endpoint in ~/.gemini/settings.json:

cp settings.json.example ~/.gemini/settings.json

Update the endpoint in the file to match your region. settings.json has no headers field, so OTEL_EXPORTER_OTLP_HEADERS must still be set as an environment variable for authentication to work—even when using this file.
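For reference, the telemetry section of the copied file looks roughly like the sketch below. Field names follow the Gemini CLI telemetry documentation; treat the copied settings.json.example as the authoritative shape, and note again that authentication headers still come from the environment:

```json
{
  "telemetry": {
    "enabled": true,
    "target": "local",
    "otlpEndpoint": "<your-coralogix-otlp-endpoint>",
    "logPrompts": false
  }
}
```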

Validate the integration

After running a Gemini CLI session, confirm that data is flowing:

  1. In Coralogix, navigate to Metrics Explorer and search for the metric prefix gemini_cli. Token usage and API request metrics appear within one export interval (~10 seconds).
  2. Navigate to Logs and filter by application name gemini-cli. Search for logRecord.body:"CLI configuration loaded." to find the startup config event.
  3. Navigate to Explore, then Tracing and filter by service name gemini-cli.

Monitor data in Coralogix

Import the dashboard

  1. In Coralogix, navigate to Dashboards, then select New Dashboard, then Import from JSON.
  2. Upload coralogix-gemini-cli-dashboard.json from the cloned repository.

Dashboard sections

  • Overview: Sessions, total tokens, input/output tokens, API errors, tool success rate (24h)
  • Session activity: Sessions over time, model distribution
  • Token usage and efficiency: Token volume by type and model, cache token ratio, thought tokens over time, average tokens per session
  • API performance: Requests per minute, latency p50/p90/p99, error rate, errors by type, status code distribution
  • Tool usage: Tool calls over time, top tools by call count, success rate and latency by function, decision breakdown (accept/reject/auto_accept/modify), Model Context Protocol (MCP) versus native split
  • File operations: File operations over time by type, lines changed over time

Configuration examples

Activate prompt logging

activate.sh sets GEMINI_TELEMETRY_LOG_PROMPTS=false by default. Set it to true to include prompt text in log events:

export GEMINI_TELEMETRY_LOG_PROMPTS=true

Note

When using the -p flag (for example, gemini -p "your prompt"), the prompt appears in process.command_args resource attributes regardless of this setting—the OTel Node.js SDK captures all process arguments automatically. Use interactive mode to keep prompt content out of telemetry.

Data reference

Metrics

All metrics appear in Metrics Explorer under the gemini_cli prefix. Coralogix inserts the OTel unit into the Prometheus metric name on ingestion, so the names differ from the OTel spec names.
Each entry lists the OTel metric, its Prometheus name in Coralogix, and what it tracks:

  • gemini_cli.session.count → gemini_cli_session_count_total: Sessions started
  • gemini_cli.token.usage → gemini_cli_token_usage_total: Token usage by type and model
  • gemini_cli.api.request.count → gemini_cli_api_request_count_total: API requests
  • gemini_cli.api.request.latency → gemini_cli_api_request_latency_ms_{bucket,sum,count,max,min}: API request latency
  • gemini_cli.tool.call.count → gemini_cli_tool_call_count_total: Tool calls
  • gemini_cli.tool.call.latency → gemini_cli_tool_call_latency_ms_{bucket,sum,count,max,min}: Tool call latency
  • gemini_cli.model_routing.latency → gemini_cli_model_routing_latency_ms_{bucket,sum,count,max,min}: Model routing latency
  • gemini_cli.file.operation.count → gemini_cli_file_operation_count_total: File operations
  • gemini_cli.lines.changed → gemini_cli_lines_changed_total: Lines changed
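As an example of querying these, the latency histograms can feed histogram_quantile in PromQL. A sketch for p90 API request latency over a 5-minute window; the standard Prometheus le label is assumed on the bucket series:

```promql
histogram_quantile(
  0.9,
  sum by (le) (rate(gemini_cli_api_request_latency_ms_bucket[5m]))
)
```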

Log events

Log events use $d.logRecord.attributes['event.name'] for the event name. Filter in DataPrime:

source logs | filter $d.logRecord.attributes['event.name'] == 'gemini_cli.api_error'

For the full signal reference—event names, attributes, and trace span structure—see the Gemini CLI telemetry documentation.

Advanced configuration

  • GEMINI_TELEMETRY_ENABLED (default: false): set to true to activate telemetry
  • GEMINI_TELEMETRY_TARGET: set to local for custom OTLP endpoints (not GCP)
  • GEMINI_TELEMETRY_OTLP_PROTOCOL: set to grpc for Coralogix ingress
  • GEMINI_TELEMETRY_OTLP_ENDPOINT: your Coralogix OTLP endpoint
  • GEMINI_TELEMETRY_LOG_PROMPTS (default: true): set to false to suppress prompt text in log events
  • OTEL_EXPORTER_OTLP_HEADERS: auth and routing headers (set automatically by activate.sh)

The Gemini CLI settings.json file has no headers field. The @opentelemetry/exporter-*-otlp-grpc packages read the standard OTEL_EXPORTER_OTLP_HEADERS environment variable as gRPC metadata. activate.sh sets this variable with the authorization, cx-application-name, and cx-subsystem-name values.

Permissions

  • Send-Your-Data API key (action: ingest metrics, logs, and traces): required to export OTel signals to Coralogix

For details, see Roles and permissions.

Troubleshoot

No data appears in Coralogix
Cause: environment variables were not exported into the active shell.
Fix: run source activate.sh again in the same terminal you use to run gemini, or add the permanent setup to your shell profile.
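A quick way to see which telemetry-related variables the current shell actually holds (prefixes taken from the names used in this guide):

```shell
# List exported telemetry-related variables; print a notice if none are set.
env | grep -E '^(GEMINI_TELEMETRY_|CX_|OTEL_)' || echo 'no telemetry variables exported'
```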

Metrics appear but traces do not
Cause: GEMINI_TELEMETRY_TARGET is not set to local.
Fix: confirm you exported GEMINI_TELEMETRY_TARGET=local and GEMINI_TELEMETRY_OTLP_PROTOCOL=grpc.

Prompts appear in telemetry despite LOG_PROMPTS=false
Cause: the -p flag passes the prompt as a process argument, which the OTel Node.js SDK captures automatically.
Fix: use interactive mode (gemini, then type your prompt) to keep prompt content out of telemetry.