The game has changed. It's not about how many events or flows per second you can ingest and write to a downstream database for "some day" examination. That may be fine for compliance, but compliance is not security. It's about assimilating billions of data events into a picture of actual network activity, instantly. It's about contextualizing each "needle" event with other related needles, automatically, for the analyst. Left on its own, each needle may be seen but not understood for its significance; worse, it may be missed altogether. Each miss could be a significant clue to malicious activity underway. Presented as a set of disconnected, standalone events, the needles are ones humans simply cannot pull out of the proverbial haystack.
Today's security analysts spend their days trying to collect data from various network and security products, organize that data into a useful format, and then decide which questions to ask of it, all with the goal of determining whether a breach or misuse has occurred. Assuming they have the time, skill, and inclination to get even that far, they then face the gargantuan task of teasing answers to those questions out of a massive data set.
The Stream Processing Engine is the foundation on which all data collection and security analytics processing is performed. It operates across Data Mining Units (DMUs) and a Module Processing Unit (MPU). The engine is unique in its ability to waterfall data through a series of interconnected modules, passing only the specific contextual information required to update the downstream modules that need it. It is not a relational database, nor a distributed map-reduce technology; it is designed around complex event processing (CEP) principles. Click's Stream Processing Engine enables large amounts of telemetry data to be retained in memory for extremely fast, automated analytics processing.
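To make the "waterfall" idea concrete, the sketch below shows a minimal CEP-style pipeline in Python. It is an illustration only: the module names, fields, and thresholds (FailedLoginModule, CorrelationModule, Context, score) are hypothetical assumptions for this example and are not Click's actual modules or API. The point it demonstrates is that each module keeps its own in-memory state and forwards only a small derived context record downstream, never the raw event stream.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Context:
    """Compact contextual record passed downstream (hypothetical fields)."""
    src_ip: str
    indicator: str   # e.g. "failed_login_burst"
    score: float     # suspicion weight contributed by this module

class Module:
    """One stage in the waterfall; holds its own in-memory state."""
    def __init__(self, downstream: Optional["Module"] = None):
        self.downstream = downstream

    def process(self, event: dict) -> None:
        ctx = self.evaluate(event)
        # Forward only the derived context, not the raw event
        if ctx and self.downstream:
            self.downstream.accept(ctx)

    def evaluate(self, event: dict) -> Optional[Context]:
        raise NotImplementedError

    def accept(self, ctx: Context) -> None:
        pass  # downstream modules consume context updates here

class FailedLoginModule(Module):
    """Counts authentication failures per source IP, entirely in memory."""
    def __init__(self, downstream=None, threshold: int = 3):
        super().__init__(downstream)
        self.counts: dict[str, int] = {}
        self.threshold = threshold

    def evaluate(self, event: dict) -> Optional[Context]:
        if event.get("type") != "auth_failure":
            return None
        ip = event["src_ip"]
        self.counts[ip] = self.counts.get(ip, 0) + 1
        if self.counts[ip] >= self.threshold:
            return Context(src_ip=ip, indicator="failed_login_burst", score=0.6)
        return None

class CorrelationModule(Module):
    """Accumulates context from upstream modules and flags hosts."""
    def __init__(self):
        super().__init__()
        self.scores: dict[str, float] = {}

    def accept(self, ctx: Context) -> None:
        self.scores[ctx.src_ip] = self.scores.get(ctx.src_ip, 0.0) + ctx.score
        if self.scores[ctx.src_ip] >= 1.0:
            print(f"ALERT: {ctx.src_ip} correlated indicators, "
                  f"score {self.scores[ctx.src_ip]:.1f}")

# Wire the waterfall: raw events enter the head module,
# only compact Context records flow downstream.
correlator = CorrelationModule()
logins = FailedLoginModule(downstream=correlator, threshold=3)

for _ in range(4):
    logins.process({"type": "auth_failure", "src_ip": "10.0.0.7"})
```

In this toy wiring, the fourth failed login from 10.0.0.7 pushes the correlated score past the (assumed) alert threshold and an alert prints. The design point is that per-module state stays in memory and only small context records cross module boundaries, which is what lets a CEP-style engine keep pace with billions of raw events without writing them all to a downstream database first.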