FluentD

Coralogix provides seamless integration with FluentD so you can send your logs from anywhere and parse them according to your needs.

Prerequisites

Have FluentD installed. For more information on how to implement it, see the FluentD implementation docs.

This document includes cluster-dependent URLs. Each URL has a variable part. Find the group below that matches the top-level domain of your Coralogix account (.com, .us, or .in) and replace the variable part of each URL with the corresponding entry.

.com
  Elasticsearch-API: https://coralogix-esapi.coralogix.com:9443
  SSL Certificates: https://coralogix-public.s3-eu-west-1.amazonaws.com/certificate/Coralogix-EU.crt
  Cluster URL: coralogix.com

.us
  Elasticsearch-API: https://esapi.coralogix.us:9443
  SSL Certificates: https://www.amazontrust.com/repository/AmazonRootCA1.pem
  Cluster URL: coralogix.us

.in
  Elasticsearch-API: https://es-api.app.coralogix.in:9443
  SSL Certificates: https://coralogix-public.s3-eu-west-1.amazonaws.com/certificate/Coralogix-IN.pem
  Cluster URL: app.coralogix.in

Usage

You must provide the following variables when creating a Coralogix logger instance.

Private Key – A unique ID that represents your company. This ID is sent to your email when you register with Coralogix.

Application Name – The name of your environment. For example, a company named “SuperData” would typically use “SuperData”, or “SuperData – Test” when debugging its test environment.

SubSystem Name – Your application probably has multiple components, for example, backend servers, middleware, and frontend servers. Setting the subsystem parameter is vital for examining the data you need.

Installation

td-agent: 

$ td-agent-gem install fluent-plugin-coralogix

Ruby

$ gem install fluent-plugin-coralogix

 

Windows FluentD Install

After installing FluentD (see the FluentD implementation docs):

Go to your Start menu and look for the td-agent command prompt, right-click it, and run it as administrator.

Copy and paste the command below and press Enter:

 

fluent-gem install fluent-plugin-coralogix_logger

We also provide scenarios for configuration management systems.

Configuration

Common

Open your Fluentd configuration file and add the Coralogix output. If you installed Fluentd using the td-agent packages, the config file is located at /etc/td-agent/td-agent.conf. If you installed Fluentd using the Ruby gem, the config file is located at /etc/fluent/fluent.conf.
<match **>
  @type coralogix
  privatekey "YOUR_PRIVATE_KEY"
  appname "prod"
  subsystemname "fluentd"
  is_json true
</match>
The first four keys (@type, privatekey, appname, subsystemname) are mandatory, while the last one (is_json) is optional.

The following settings will help you point to the correct endpoint based on your account location.

In the configuration file, in the match section under @type coralogix, add the appropriate endpoint line:

Account URL ending with .us: endpoint "api.coralogix.us"
Account URL ending with .com: endpoint "api.coralogix.com"
Account URL ending with .in: endpoint "api.app.coralogix.in"

Setting the endpoint as above spares you from adding the following environment variables, but for reference, here they are (replace Cluster URL with the matching entry from the Prerequisites section):

CORALOGIX_LOG_URL=https://api.Cluster URL/api/v1/logs

CORALOGIX_TIME_DELTA_URL=https://api.Cluster URL/sdk/v1/time
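
For example, a complete output section for an account on the .com cluster might look like the following sketch (substitute your own private key, application name, and subsystem name):

<match **>
  @type coralogix
  privatekey "YOUR_PRIVATE_KEY"
  appname "prod"
  subsystemname "fluentd"
  endpoint "api.coralogix.com"
  is_json true
</match>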

Application and subsystem name

In case your input stream is a JSON object, you can extract the application name and/or subsystem name from the JSON using the $ sign.

appname $APP_NAME_KEY
subsystemname $SUB_NAME_KEY

For instance, with the JSON below, appname $application will extract testApp as the Coralogix application name.

{
    "application": "testApp",
    "subsystem": "testSub",
    "code": "200",
    "stream": "stdout",
    "timestamp": "2016-07-20T17:05:17.743Z",
    "message": "hello_world",
}

*Note – nested JSONs are also supported, so you can extract nested values into appname and/or subsystemname, e.g. appname $log.application.
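
For instance, with a nested record like the hypothetical one below, appname $log.application would extract testApp as the application name:

{
    "log": {
        "application": "testApp",
        "subsystem": "testSub"
    },
    "message": "hello_world"
}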

Record content

In case your input stream is a JSON object and you don’t want to send the entire JSON but just a portion of it, you can add the log_key_name parameter to the output section of your FluentD configuration file with the name of the key you want to send. For instance, with the above example, if you write:

log_key_name message

then only the message key will be sent. If you do want to send the entire JSON, simply remove this parameter from your configuration file.
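
In context, the output section with this parameter might look like the following sketch (the other keys are taken from the example configuration above):

<match **>
  @type coralogix
  privatekey "YOUR_PRIVATE_KEY"
  appname "prod"
  subsystemname "fluentd"
  is_json true
  log_key_name message
</match>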

Timestamp

If you want to use a specific field as the timestamp in Coralogix, you can use the timestamp_key_name option:

timestamp_key_name timestamp

Log records will then take their timestamp from this field.

Note: We accept only logs that are not older than 24 hours.

JSON support

In case your raw log message is a JSON object, set the is_json key to true; otherwise, you can omit it.

is_json true

Proxy support

This plugin supports sending data via proxy. Here is an example of the configuration:

<match **>
  @type coralogix
  privatekey "YOUR_PRIVATE_KEY"
  appname "prod"
  subsystemname "fluentd"
  is_json true
  <proxy>
    host "PROXY_ADDRESS"
    port PROXY_PORT
    # user and password are optional parameters
    user "PROXY_USER"
    password "PROXY_PASSWORD"
  </proxy>
</match>

Auto-mapping support

In case your raw log message is a JSON object containing fields with information such as a geographic location (lat, lon), a DateTime, or an IP address, you can add a specific suffix to the key name (see the examples below) using a filter in your configuration, or by using Coralogix parsing rules. The field will then be automatically mapped as a geo-point, date, or IP respectively. As a result, you will be able to create a geo-location map visualization, use your log timestamp as the timestamp in range queries and Kibana visualizations, and query IP addresses using CIDR notation.

E.g. Geographic location

Original log

{
  ...
  "text": "Geo-point data",
  "location": { 
    "lat": 41.12,
    "lon": -71.34
  }
  ...
}

Adding the _geopoint suffix to the location key name

{
  ...
  "text": "Geo-point data",
  "location_geopoint": { 
    "lat": 41.12,
    "lon": -71.34
  },
  ...
}
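
One possible way to perform this rename inside FluentD is with the standard record_transformer filter (a sketch; place it before the Coralogix match section):

<filter **>
  @type record_transformer
  enable_ruby true
  # move the original "location" object to a key carrying the _geopoint suffix
  remove_keys location
  <record>
    location_geopoint ${record["location"]}
  </record>
</filter>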
E.g. DateTime

Original log

{
  ...
  "time": "2020-10-13T09:45:33.783441Z",
  ...
}

Adding the _custom_timestamp suffix to the time key name

{
 ... 
 "time_custom_timestamp": "2020-10-13T09:45:33.783441Z",
 ... 
}

Note that the time format must be date_optional_time or strict_date_optional_time.

E.g. IP

Original log

{
  ...
  "ip_addr": "192.168.1.1",
  ...
}

Adding the _ipaddr suffix to the ip_addr key name

{
  ...
  "ip_addr_ipaddr": "192.168.1.1",
  ...
}