The Coralogix Platform leverages Streama© technology to provide real-time insights and long-term trend analysis with no reliance on storage or indexing.
Distributed Persistence Data Layer and Fault Tolerance
Robust integration logic with batching and backpressure
Non-blocking IO for concurrent data processing
Advanced auto-scaling using K8s HPA, VPA, and Keda
Zero data loss and 99.99% service uptime
Cost optimization with savings of up to 70%
Streama© is the foundation of our stateful streaming data platform. It’s based on our 3 “S” architecture – source, stream, and sink – using Kafka Streams within Kubernetes clusters and running fully as SaaS.
Data is ingested from any external source using Kafka Connect, which produces events and state storage to Kafka topics and KTables.
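The source–stream–sink pattern can be sketched in miniature without Kafka itself. In this toy version, in-process queues stand in for Kafka topics, and all function and field names are illustrative, not Coralogix's implementation:

```python
# Minimal sketch of the source -> stream -> sink pattern.
# Queues stand in for Kafka topics; the real platform uses
# Kafka Connect, Kafka Streams, and Kubernetes.
from queue import Queue

def source(events, topic: Queue):
    """Ingest raw events from an external system into a topic."""
    for event in events:
        topic.put(event)

def stream(in_topic: Queue, out_topic: Queue):
    """Transform each event; here we just tag it as processed."""
    while not in_topic.empty():
        event = in_topic.get()
        out_topic.put({**event, "processed": True})

def sink(out_topic: Queue):
    """Deliver processed events to a downstream store or dashboard."""
    results = []
    while not out_topic.empty():
        results.append(out_topic.get())
    return results

raw, processed = Queue(), Queue()
source([{"msg": "login ok"}, {"msg": "disk full"}], raw)
stream(raw, processed)
print(sink(processed))  # both events, each tagged processed=True
```

The key property mirrored here is decoupling: the source never talks to the sink directly, so each stage can scale and fail independently.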
As data flows change, the Coralogix Platform automatically scales up and down according to CPU, RAM, data volumes, latency, and more in order to provide a smooth and performant experience with 99.99% uptime at any scale.
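As a simplified illustration of this kind of scaling, a Kubernetes HorizontalPodAutoscaler (autoscaling/v2) can grow and shrink a stream-processing deployment based on CPU and memory utilization. The names and thresholds below are illustrative, not Coralogix's actual configuration:

```yaml
# Hypothetical HPA for a stream-processing Deployment.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: stream-processor
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: stream-processor
  minReplicas: 3
  maxReplicas: 50
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
    - type: Resource
      resource:
        name: memory
        target:
          type: Utilization
          averageUtilization: 75
```

Scaling on latency or data volume, as described above, would require custom or external metrics, which is where tools like KEDA come in.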
Event data is seamlessly collected from hundreds of sources for a single, aggregated view of system health.
Event enrichment sources are ingested and correlated with event data to ensure that all pertinent information is being collected.
Third-party data sources such as status pages, cloud availability reports, CI/CD platforms, and more are leveraged to provide context around how they affect your production.
Events flow to Kafka for stream analysis and are automatically parsed, enriched, and clustered using machine learning algorithms.
Streama©’s stateful streaming uses dedicated Kafka topics to store and enrich the main events flow with a state. This powers real-time monitoring and insights and also addresses the need to track long-term trends.
Data is ingested and immediately enters the parsing engine, which executes regex rules to parse, mask, extract, or block data without any pre-configuration. Data can then be enriched using pre-built sources or the customer’s own database.
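A regex-based parse-and-mask step can be sketched as follows. The rules and field names here are illustrative examples, not Coralogix's rule set:

```python
# Sketch of a regex parsing step: mask sensitive values, then
# extract structured fields from the remaining text.
import re

MASK_RULES = [
    (re.compile(r"\b\d{16}\b"), "****"),  # mask card-like 16-digit numbers
]
EXTRACT_RULE = re.compile(r"user=(?P<user>\w+) status=(?P<status>\d+)")

def parse_event(line: str) -> dict:
    # Masking runs first so sensitive data never reaches extracted fields.
    for pattern, replacement in MASK_RULES:
        line = pattern.sub(replacement, line)
    match = EXTRACT_RULE.search(line)
    fields = match.groupdict() if match else {}
    return {"raw": line, **fields}

event = parse_event("user=alice status=200 card=4111111111111111")
print(event)  # card number masked, user/status extracted
```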
All event data from all servers can be monitored in Coralogix’s UI or in your own terminal via the CLI with less than five seconds of latency. Events in the live tail can be filtered by app or subsystem, or with any grep-style text or regex query.
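Grep-style live-tail filtering reduces to matching each incoming line against a plain-text or regex query. A minimal sketch, with illustrative function names:

```python
# Sketch of live-tail filtering: stream log lines through a
# grep-like plain-text or regex query.
import re

def live_tail(lines, query: str, use_regex: bool = False):
    pattern = re.compile(query) if use_regex else None
    for line in lines:
        if pattern.search(line) if pattern else query in line:
            yield line

logs = [
    "app1 GET /health 200",
    "app2 POST /login 500",
    "app1 GET /users 200",
]
print(list(live_tail(logs, r"5\d\d$", use_regex=True)))  # only the 500 line
```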
Compliance data can be identified and archived at minimal cost. The remaining data runs through the monitoring engine and is then sent to the archive or, for frequently searched data, to hot storage.
Machine-learning algorithms cluster countless individual events into a finite number of templates to enable monitoring of common events and the identification of anomalies.
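The idea of collapsing many events into a finite set of templates can be shown with a toy heuristic: replace variable tokens with a placeholder and count each resulting template. Real systems use ML models; everything here is an illustrative simplification:

```python
# Toy template clustering: normalize variable tokens (numbers,
# hex ids) to a placeholder, then count each template.
import re
from collections import Counter

VARIABLE = re.compile(r"\b(?:\d+|0x[0-9a-f]+)\b")

def template_of(message: str) -> str:
    return VARIABLE.sub("<VAR>", message)

def cluster(messages):
    return Counter(template_of(m) for m in messages)

logs = [
    "user 17 logged in",
    "user 42 logged in",
    "disk usage at 91 percent",
]
print(cluster(logs))  # two templates: login (x2) and disk usage (x1)
```

Once events map to templates, monitoring common patterns and flagging never-before-seen templates becomes a counting problem.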
Event data can be aggregated into metrics via standard queries. The retention period for aggregated metrics can be set to any length, providing ample opportunity for data analysis without retention restrictions.
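Aggregating events into metrics amounts to grouping and reducing. A minimal sketch, with illustrative field names, computing a count and average latency per endpoint:

```python
# Sketch of events-to-metrics aggregation: group events by
# endpoint, then reduce to a count and average latency.
from collections import defaultdict

def events_to_metrics(events):
    totals = defaultdict(lambda: {"count": 0, "latency_sum": 0.0})
    for e in events:
        bucket = totals[e["endpoint"]]
        bucket["count"] += 1
        bucket["latency_sum"] += e["latency_ms"]
    return {
        endpoint: {
            "count": b["count"],
            "avg_latency_ms": b["latency_sum"] / b["count"],
        }
        for endpoint, b in totals.items()
    }

events = [
    {"endpoint": "/login", "latency_ms": 120.0},
    {"endpoint": "/login", "latency_ms": 80.0},
    {"endpoint": "/users", "latency_ms": 40.0},
]
print(events_to_metrics(events))
```

The aggregated output is far smaller than the raw events, which is why metrics can be retained indefinitely at low cost.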
Machine learning algorithms learn the typical flow of data and identify suspected errors based on correlated events, including abnormal spikes and log ratios.
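One simple form of spike detection flags a new value that deviates from the learned baseline by more than a few standard deviations. The platform's actual models are more sophisticated; the threshold and data below are illustrative:

```python
# Toy spike detector: flag values more than k standard deviations
# from the mean of recent history (a z-score test).
from statistics import mean, stdev

def is_spike(history, value, k=3.0):
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(value - mu) > k * sigma

errors_per_minute = [4, 5, 6, 5, 4, 5, 6, 5]
print(is_spike(errors_per_minute, 40))  # True: abnormal spike
print(is_spike(errors_per_minute, 5))   # False: within normal range
```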
Traffic mirroring via AWS VPC provides access to all data being transferred between your servers and other cloud-based infrastructure.
Real-time, low-latency alerts, whether manually defined or ML-powered, are triggered on various criteria for logs, metrics, tracing, and security information.
All streaming analytics are available per specifically defined user groups and permissions.
Data and insights are sent to any external destination once they have passed through the stream analysis engine.
This includes external storage locations such as an S3 bucket or the Coralogix-hosted hot storage as well as third-party visualization and alerting services.
Data can be sent to external long-term storage in Parquet format, which can also be queried directly from Coralogix. Archived event data can be reindexed via direct query at any time.
All events, aggregations, and insights can be sent for visualization in our purpose-built UI, Kibana, Grafana, SQL clients, Tableau, and more.
Using the Coralogix CLI and full API support, event data can easily be exported to any third-party tool or external location.