An organization’s security protocols are vital to maintaining transparency, compliance with government regulations, and trust with customers. On April 28, 2022, the Indian Computer Emergency Response Team (CERT-In) released updated directions for compliance requirements for all India-based companies and organizations with Indian clients.
It’s critical to keep in mind that these rules exist to keep organizations and customers safe from cybersecurity attacks and to ensure that the correct steps are taken in a timely manner.
So what does this mean for you?
It means that, among a few other updates, you’ll have to retain your log data for 180 days to meet the Indian compliance regulation requirements.
And although this might seem overwhelming and financially burdensome, we’ve got all your bases covered.
With growing log volumes and increasingly strict retention regulations, the cost of storing and analyzing log data with traditional approaches can become a significant financial burden. Coralogix uses proprietary Streama© technology to analyze observability data in-stream, without relying on indexing or a centralized data store.
This means companies can centralize their observability data and ensure they remain compliant with all local and global security requirements without breaking the bank.
As data enters Coralogix, it is parsed and enriched and then stored in an Amazon S3 archive bucket that you control. This means no matter what level of analysis and monitoring you need, you always maintain full access to your data – for as long as you need it. Configure your bucket to reside in AWS’s Mumbai region with 180-day retention for compliance with the updated CERT-In directive.
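As a sketch of what the retention side can look like in AWS, the snippet below builds a 180-day S3 lifecycle expiration rule for a Mumbai-region (ap-south-1) archive bucket. The bucket name and rule ID are hypothetical, and the boto3 call that would apply the rule is shown commented out:

```python
# Illustrative sketch: a 180-day S3 lifecycle rule for an archive bucket
# in AWS's Mumbai region (ap-south-1). Names here are hypothetical.
import json

ARCHIVE_REGION = "ap-south-1"  # AWS Mumbai region
RETENTION_DAYS = 180           # CERT-In log retention requirement


def build_lifecycle_config(retention_days: int) -> dict:
    """Build an S3 lifecycle configuration that expires objects after the
    given number of days (the shape accepted by boto3's
    put_bucket_lifecycle_configuration)."""
    return {
        "Rules": [
            {
                "ID": f"expire-after-{retention_days}-days",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to every object
                "Expiration": {"Days": retention_days},
            }
        ]
    }


config = build_lifecycle_config(RETENTION_DAYS)
print(json.dumps(config, indent=2))

# To apply it (requires AWS credentials; shown for completeness):
# import boto3
# s3 = boto3.client("s3", region_name=ARCHIVE_REGION)
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-coralogix-archive",  # hypothetical bucket name
#     LifecycleConfiguration=config,
# )
```

The same rule can of course be expressed in Terraform, CloudFormation, or the AWS console; what matters for the directive is the 180-day floor on expiration.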
Query your archive directly from the Coralogix UI or via CLI with no additional compute cost or impact on your daily quota. Data can then be easily exported for an audit or reindexed to the Coralogix platform for investigation.
Part of what sets the Coralogix platform apart is the ability to extract infinite value from your data without ever needing to index it.
Use the Logs2Metrics feature to generate metrics on the fly from your logs and send the raw log data directly to your archive. The metrics are stored for a full year for visualization and alerting at no additional cost, and the raw data can be accessed directly from your archive at any time. Advanced alerting with dynamic thresholds, log clustering, and anomaly detection can also be leveraged without indexing.
This means that you can monitor your data with more precision, better performance, and at a much lower cost.
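The logs-to-metrics pattern itself can be sketched in a few lines: derive a numeric time series from the raw records in-stream, so only the compact metric needs to stay hot while the raw logs flow to the archive. This is a minimal illustration of the idea, not the Coralogix implementation; the field names ("severity", "ts") are hypothetical:

```python
# Minimal sketch of the logs-to-metrics idea: aggregate raw log records
# into a per-minute error count in-stream. Field names are hypothetical,
# not the Coralogix schema.
from collections import Counter


def logs_to_error_metric(logs: list[dict]) -> Counter:
    """Count ERROR-severity logs per minute bucket (epoch seconds)."""
    counts = Counter()
    for record in logs:
        if record.get("severity") == "ERROR":
            minute = record["ts"] - record["ts"] % 60  # floor to the minute
            counts[minute] += 1
    return counts


raw_logs = [
    {"ts": 100, "severity": "INFO",  "msg": "request ok"},
    {"ts": 110, "severity": "ERROR", "msg": "db timeout"},
    {"ts": 115, "severity": "ERROR", "msg": "db timeout"},
    {"ts": 185, "severity": "ERROR", "msg": "cache miss storm"},
]

metric = logs_to_error_metric(raw_logs)
print(dict(metric))  # prints {60: 2, 180: 1}
```

The metric is a fraction of the size of the raw logs, which is why storing it hot for a year is cheap while the full records wait in the archive.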
As data volumes continue to grow, costs typically increase as well. We understand different data is used for different goals. That’s why with our technology, you can designate the data to different analytics pipelines by use case, allowing you to reduce costs while maintaining system visibility.
Use our TCO Optimizer to prioritize your data into one of three data pipelines according to your analytics and monitoring needs, so that you pay based on the value of your data rather than its volume.
Compliance Pipeline: Within this pipeline, you can store data that’s needed for compliance purposes. Data in this pipeline is written to your own archive bucket after passing through the parser, enrichment, and Live Tail. It can still be queried at any time, without counting against the quota, ensuring you meet all the CERT-In guidelines.
Monitoring Pipeline: Any data that needs to be visualized, tracked, alerted on, and monitored in real time will flow through the Monitoring pipeline. Within this pipeline, you can leverage the Logs2Metrics, Alerting, and Anomaly Detection features without ever needing to index the raw log data.
With these features, you’ll be able to quickly and easily identify security risks before they affect your business or customers. Remember that according to the new directions, you will need to report security incidents to CERT-In within 6 hours.
Frequent Search Pipeline: Any data that is queried frequently for investigations or troubleshooting (for example, critical- or error-level logs) can be sent to the Frequent Search pipeline. In addition to the advanced features of the Monitoring pipeline, this data will be indexed and placed in hot storage to enable lightning-fast queries.
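A routing policy in the spirit of the three pipelines above might look like the following sketch, where severity and subsystem decide the destination. The rules and field names are hypothetical examples of a policy you might define, not the TCO Optimizer’s actual interface:

```python
# Illustrative sketch of prioritizing data into three pipelines by use
# case. The routing rules and field names below are hypothetical, not
# Coralogix's API.
COMPLIANCE = "compliance"            # archived, queryable on demand
MONITORING = "monitoring"            # metrics and alerting, no indexing
FREQUENT_SEARCH = "frequent_search"  # indexed in hot storage


def route(record: dict) -> str:
    """Assign a log record to one of the three pipelines."""
    if record.get("severity") in ("ERROR", "CRITICAL"):
        # Troubleshooting data: worth the cost of indexing for fast queries.
        return FREQUENT_SEARCH
    if record.get("subsystem") == "audit":
        # Retained for CERT-In; rarely queried, so archive-only is enough.
        return COMPLIANCE
    # Everything else is tracked and alerted on without indexing.
    return MONITORING


routed = [
    route({"severity": "ERROR", "subsystem": "payments"}),
    route({"severity": "INFO", "subsystem": "audit"}),
    route({"severity": "INFO", "subsystem": "payments"}),
]
print(routed)  # prints ['frequent_search', 'compliance', 'monitoring']
```

The design point is that the policy is yours to define: only the data you actually search often pays the indexing cost, while everything still lands in the archive.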
Across all three pipelines, you have full control over where your data goes, access to all Coralogix features for all users, and fully optimized costs with no surprises.
Regardless of which pipeline your data is sent to, all of it is written to your archive bucket, so you ALWAYS have full access and control, independent of indexing and retention settings. Rest assured, you can easily retain all your logs for 180 days (or however long you want), maintain full oversight of your system’s health, work with a cost-effective solution, and meet the compliance requirements in full.
Learn more about the Coralogix platform or request a demo at any time for a personalized walkthrough!