SIEMs were a great technology when we were dealing with protecting the known, with fixed perimeters and signature-based security. But is this reflective of today’s dynamic threat landscape, with a porous perimeter and workloads moving to the cloud?
When I graduated from university back in the late '80s, I was a computer programmer for a large insurance company, working on IBM mainframe applications (IMS DB/DC and CICS/DB2). These were large, self-contained, monolithic applications, with development, testing and delivery cycles that took 12-18 months. We sat down with users, collected requirements, built prototypes, and went through unit, regression and QA testing before rolling things into production – OLD SCHOOL.
Think about the modern digital companies that are successful today – Airbnb, Netflix, Uber, Amazon, Skype, Twitter, LinkedIn to name a few. These companies – in order to drive continuous innovation and remain relevant to their customers – are leveraging microservices, containers like Docker, and configuration management tools like Chef and Puppet. They are driving continuous delivery initiatives weekly, even daily, at a pace we have not seen before. And to support this rapid pace, organizations are looking to run a majority of these workloads on modern, advanced IT infrastructure, such as public cloud providers like AWS or Azure.
So when you think about the CI/CD lifecycle, and the cloud-based infrastructures these modern applications run on, we are dealing with a lot of layered components – operating systems, applications, network devices, storage devices, servers, workstations, etc. – and all this infrastructure produces a lot of data, siloed data. When you consider the volume, variety and velocity of these data streams, it becomes extremely challenging for SIEMs to ingest data at this scale and extract answers and insights in a timely fashion. The SIEM architecture becomes their Achilles' heel – or, more appropriately, a ball and chain around their ankle.
Additionally, as organizations move into this digital world – developing modern applications and leveraging mobile, social, information and cloud to deliver new and disruptive experiences to their customers – the predictability of workload volumes is less certain. Think about what happens to Airbnb during the Thanksgiving travel season, or Target during the Christmas shopping season, or how requests for Uber rides spike during a major sporting event like the Super Bowl. This capacity has to be planned for; the hardware and software need to be provisioned, the people allocated, and so on. That takes time, money and foresight. Wouldn't a secure, highly elastic, cloud-native security analytics service that bursts automatically as needed be a lot easier than over-provisioning servers that handle peak volumes but sit well below capacity for the majority of the year?
To truly be rid of this ball and chain, one needs to move beyond the rigid, fixed correlation rules that generate so much alert fatigue among InfoSec teams that they are generally ignored. These rules were great at surfacing known events, but what about the unknown events? What happens when you do not even know the questions to ask? With millions of events and log entries generated daily, finding these indicators of compromise (IOCs) is like trying to find a needle in a haystack. It becomes humanly impossible.
This is where Security Analytics solutions step in (please refer to Figure 1: SIEM vs. Security Analytics Checklist). By leveraging machine learning algorithms and data science, they are able to identify abstract relationships, anomalies and trends, and surface problems automatically. Security analytics solutions look at the data more holistically, providing full-stack visibility across hybrid infrastructures.
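To make the contrast with fixed correlation rules concrete, here is a minimal, hypothetical sketch of the statistical-baselining idea behind such anomaly detection: instead of a handwritten threshold, flag any hour whose event count sits far outside the historical distribution. The data and function names are illustrative only, not any vendor's implementation.

```python
from statistics import mean, stdev

def find_anomalies(counts, threshold=2.5):
    """Return the indices of hours whose event count deviates more than
    `threshold` standard deviations from the mean of the series."""
    mu, sigma = mean(counts), stdev(counts)
    return [i for i, c in enumerate(counts)
            if sigma > 0 and abs(c - mu) / sigma > threshold]

# Hourly failed-login counts (hypothetical): a quiet baseline,
# then a sudden spike that no one wrote a rule for in advance.
hourly_failed_logins = [12, 9, 11, 10, 13, 8, 11, 10, 240, 12]
print(find_anomalies(hourly_failed_logins))  # [8] – the spike
```

The point of the sketch is that nobody had to predefine "more than N failed logins is bad"; the baseline is learned from the data itself, which is the same shift in approach, at toy scale, that security analytics platforms make.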
To summarize, below are six takeaways on the SIEM vs. Security Analytics debate that I've pulled together based on industry analysts' and thought leaders' feedback. Use them as a guide for your future security solution investments.
Six Takeaways on the SIEM vs. Security Analytics Debate
- Security data is unmanageable with legacy SIEM tools
- Advanced analytics are being integrated into security products because rule- and signature-based prevention systems, and the tuning processes behind them, have struggled to detect or stop the most serious breaches of the past few years
- Security and risk professionals must evolve their tool set and capabilities to keep up with the maturing threat landscape
- Consider threats that are already inside the enterprise: SIEM tools are typically deployed to look at the perimeter of the network, yet this mentality can expose organizations to great risk
- Machine-learning algorithms and analysis techniques have advanced far beyond what was available in the commercial market only two to three years ago. They also address the issue dubbed "we don't know what we don't know"
- Security analytics' core function is to monitor and collect vast amounts of information from the environment to identify threats that indicate elevated risk, and ultimately to prevent lateral spread of those threats and data exfiltration. To succeed in this endeavor, the analytics platform identifies and prioritizes threats without requiring administrators and analysts to create policies or rules
This is truly a transformative shift that we see once a decade. Are you ready to join the ride or are you content with the status quo?
This is the second in a series of blogs on SIEM and Security Analytics. To read last week's post, go to “SIEM: Crash and Burn or Evolution? You Decide”.
Sign up for a free trial of Sumo Logic. It’s quick and easy. Within just a few clicks you can configure streaming data, and start gaining security insights into your data in seconds.