Streama© Technology

How It Works

The Coralogix Platform leverages Streama© technology to provide real-time insights and long-term trend analysis with no reliance on storage or indexing.

Distributed persistence data layer and fault tolerance

Robust integration logic with batching and backpressure

Non-blocking IO for concurrent data processing

Advanced auto-scaling using Kubernetes HPA, VPA, and KEDA

Zero data loss and 99.999% service uptime

Cost optimization with savings of up to 70%

What is Streama©?

Streama© is the foundation of our stateful streaming data platform. It’s based on our 3 “S” architecture – source, stream, and sink – using Kafka Streams built within Kubernetes clusters and running fully as SaaS.

Source

Data is ingested from any external source using Kafka Connect, which produces events and state storage to Kafka topics and KTables.
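
The split between an events topic and a KTable can be pictured as an append-only log alongside a latest-state-per-key view. A minimal in-memory sketch in plain Python (a list and a dict stand in for the Kafka topic and KTable; record shapes are invented for illustration):

```python
# Hypothetical in-memory stand-ins: a Kafka topic is an append-only event log,
# while a KTable keeps only the latest state per key.
events_topic = []
state_table = {}

def ingest(source_records):
    """Produce each record to the events topic and upsert its state."""
    for record in source_records:
        events_topic.append(record)                      # full event history
        state_table[record["key"]] = record["payload"]   # latest state per key

ingest([
    {"key": "host-1", "payload": {"status": "ok"}},
    {"key": "host-1", "payload": {"status": "degraded"}},
])
```

The topic retains both events while the table keeps only the most recent state, mirroring the topic/KTable split that Kafka Connect feeds.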

As data flows change, the Coralogix Platform automatically scales up and down according to CPU, RAM, data volume, latency, and more to provide a smooth, performant experience with 99.999% uptime at any scale.
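
Scaling on CPU via the Kubernetes HPA follows a well-known proportional formula; a sketch (the 60% target is a made-up example, not Coralogix's actual threshold):

```python
import math

def desired_replicas(current_replicas, current_metric, target_metric):
    """Kubernetes HPA formula: scale replica count in proportion to load."""
    return math.ceil(current_replicas * current_metric / target_metric)

# e.g. 4 pods at 90% CPU against a 60% target scale up to 6 pods
print(desired_replicas(4, 90, 60))
```

The same shape applies to any metric KEDA or the VPA exposes: when the observed value exceeds the target, the ratio pushes the replica count up, and when load drops, it shrinks back down.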


Logs, Metrics & Security Events

Event data is seamlessly collected from hundreds of sources for a single, aggregated view of system health.

Event Source Enrichment

Event enrichment sources are ingested and correlated with event data to ensure that all pertinent information is being collected.

Contextual Data Collection

Third-party data sources such as status pages, cloud availability reports, CI/CD platforms, and more are leveraged to provide context on how they affect your production environment.

Stream

Events flow to Kafka for stream analysis and are automatically parsed, enriched, and clustered using machine learning algorithms.

Streama©’s stateful streaming uses dedicated Kafka topics to store state and enrich the main event flow with it. This powers real-time monitoring and insights while also addressing the need to track long-term trends.


RT Event Transformation

Data is ingested and immediately enters the parsing engine, which executes regex rules to parse, mask, extract, or block data without any pre-configuration. Data can then be enriched using pre-built sources or the customer database.
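
The parse/mask/extract/block operations map naturally onto regular expressions; a hedged sketch (the log line, field names, and rules are invented for illustration and are not Coralogix's rule syntax):

```python
import re

LINE = '2024-05-01 user=alice card=4111111111111111 action=purchase'

# Parse/extract: pull key=value pairs into structured fields.
fields = dict(re.findall(r'(\w+)=(\S+)', LINE))

# Mask: redact a sensitive field before it leaves the pipeline.
masked = re.sub(r'(card=)\d+', r'\1****', LINE)

# Block: drop lines matching a pattern entirely.
def keep(line):
    return not re.search(r'healthcheck', line)
```

Running the rules at ingest time means downstream monitoring, storage routing, and alerting all see structured, sanitized events rather than raw text.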

Live Event Monitoring

All event data from all servers can be monitored in Coralogix’s UI or in your own terminal via the CLI with less than five seconds of latency. Events in the live tail can be filtered by application or subsystem, or by any grep/text/regex query.

Optimized Storage Routing

Compliance data can be identified and archived at minimal cost. The remaining data runs through the monitoring engine and is then sent either to the archive or, for frequently searched data, to hot storage.

Event Clustering

Machine-learning algorithms cluster countless individual events into a finite number of templates to enable monitoring of common events and the identification of anomalies.
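
Template clustering can be approximated by masking the variable tokens in each event so that structurally identical lines collapse together; a simplified sketch of the idea (real systems use ML-based clustering, this only illustrates the template concept):

```python
import re
from collections import Counter

def template(line):
    """Replace variable tokens (numbers) with a placeholder to get the event's shape."""
    return re.sub(r'\b\d+\b', '<NUM>', line)

logs = [
    "user 101 logged in",
    "user 202 logged in",
    "payment 77 failed",
]
# Three individual events collapse into two templates.
clusters = Counter(template(line) for line in logs)
print(clusters)
```

Once millions of events reduce to a finite set of templates, per-template counts become a tractable signal for spotting common behavior and flagging anomalies.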

Metric Generation

Event data can be aggregated into metrics via a standard query. Aggregated metrics are stored for 12 months at no added cost.
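
Generating a metric from events amounts to a grouped aggregation; a minimal sketch (field names and values are illustrative):

```python
from collections import defaultdict

events = [
    {"service": "api", "duration_ms": 120},
    {"service": "api", "duration_ms": 80},
    {"service": "worker", "duration_ms": 300},
]

# Aggregate per-event durations into an average-latency metric per service.
totals = defaultdict(lambda: [0, 0])  # service -> [sum, count]
for e in events:
    totals[e["service"]][0] += e["duration_ms"]
    totals[e["service"]][1] += 1

avg_latency = {svc: s / n for svc, (s, n) in totals.items()}
print(avg_latency)  # {'api': 100.0, 'worker': 300.0}
```

Storing the small aggregate instead of every raw event is what makes 12-month retention cheap.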

Automated Insights

Machine learning algorithms learn the typical flow of data and identify suspected errors based on correlated events, including abnormal spikes and log ratios.
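
Spike detection against a learned baseline can be sketched as a simple deviation check (the z-score test and the threshold here are arbitrary illustrations, not Coralogix's actual models):

```python
import statistics

def is_spike(history, current, threshold=3.0):
    """Flag the current count if it deviates too far from the learned baseline."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid division by zero on flat history
    return (current - mean) / stdev > threshold

# Baseline counts per interval for one event template (illustrative numbers).
baseline = [10, 12, 11, 9, 10, 11]
print(is_spike(baseline, 60))  # abnormal spike
print(is_spike(baseline, 12))  # within normal variation
```

Running a check like this per clustered event template is what turns "countless individual events" into a small number of meaningful anomaly signals.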

Security Traffic Analyzer (STA)

Traffic mirroring via AWS VPC provides access to all data being transferred between your servers and other cloud-based infrastructure.

Dynamic Alerting

Real-time, low-latency alerts based on various criteria, whether manually defined or ML-powered, are triggered for logs, metrics, and security information.

Full RBAC

All streaming analytics are scoped to specifically defined user groups and permissions.

Sink

Data and insights are sent to any external destination once they have passed through the stream analysis engine.

This includes external storage locations such as an S3 bucket or the Coralogix-hosted hot storage as well as third-party visualization and alerting services.


Data Forwarding

Data can be sent to external long-term storage in a readable TSV format that can also be queried directly from Coralogix. Archived event data can be reindexed via direct query at any time.

Visualization & Alerting

All events, aggregations, and insights can be sent for visualization in our purpose-built UI, Kibana, Grafana, SQL clients, Tableau, and more.

APIs / CLI

Using the Coralogix CLI and full API support, event data can easily be exported to any third-party tool or external location.

Learn More

Video

What's next for Streama?

Blog Post

Streama: Get complete monitoring coverage without paying for the noise

Blog Post

What is the Coralogix Security Traffic Analyzer (STA), and Why Do I Need It?

Stateful streaming analytics for observability data