January 24, 2012

Complex Event Processing, Big Data Collide

Datanami Staff

When it comes to evaluating complex event processing, there are many options to explore from established software vendors, including Sybase, Oracle, Tibco, StreamBase, Informatica, and others.

While these companies appear in the mainstream technology media rather frequently, we’ve also been watching the periphery of the CEP vendor landscape. This has been especially interesting because many software vendors in the CEP space had their technologies in place well before the term “big data” entered common parlance.

As Tibco’s Paul Vincent noted, there are some interesting dynamics at play between the “big data” phenomenon and complex event processing. As he stated earlier this month:

From a “big data” perspective, event processing use cases can include customer purchase records, credit card transactions, phone voice packets or text messages, inventory updates, operational sensor reports, etc etc. But from the event processing perspective (i.e. actually exploiting “big data”) there is another dimension to consider: the scale and velocity of the incoming events versus the scale and velocity (and structure) of the existing data it needs to be related to and/or processed against. Some examples might be:

  • large volumes of data at high velocities, compared to large volume of data
    = national security applications
  • large volumes of data at high velocities, compared to normal volume of data
    = sensor processing like Radar
  • normal volumes of data at high velocities, compared to large volume of data
    = web search
  • normal volumes of data at high velocities, compared to normal volume of data
    = automated trading in Capital Markets

One small company adapting to the “big data” shift is New Jersey-based EsperTech. The small software company has grown from its open source roots, capturing notable new customers in financial services, network management, fraud detection and RFID since the software’s public release in 2006, all of them established ESP/CEP markets.

According to Tom Bernhardt, CTO at EsperTech, Inc., “big data” is nothing new, at least from the perspective of a vendor that has been in the CEP game since 2006. Even so, he notes, there are some changes reshaping the way complex event processing companies approach their products.

As Bernhardt stated, “Now, with Hadoop/MapReduce and NoSQL, we are enhancing the technologies in that space. The contribution of big data to event stream analysis is that it enables more organizations to store and analyze events offline, not soft real time, but batch with high latencies and at low cost, especially when used with cloud/virtualization.”

Bernhardt added that, “Most companies see CEP as a critical competitive advantage and are very secretive.” He notes that many others take advantage of the open source core platform his company offers, but choose to keep that fact under wraps. Still, a number of customers are under EsperTech’s CEP spell, including cloud platform giants Rackspace and Huawei, as well as Raytheon and F5 Networks.

Bernhardt’s company addresses use cases that require soft real-time event stream analysis across several different verticals. He stresses the importance of the company’s Event Processing Language, a declarative language that solves event stream analysis in a direct, concise manner. The company’s Java Esper and .NET NEsper products, as Bernhardt says, “enable rapid development of applications that process large volumes of incoming messages or events.”
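To give a flavor of what a declarative stream query does, here is a minimal, hand-rolled Python sketch (not EsperTech’s actual API) of a continuously updated average over a sliding time window — the kind of computation an EPL statement expresses in a single line. The class name, event shape, and 30-second window are illustrative assumptions.

```python
from collections import deque

# Illustrative sketch, not Esper code: a time window that keeps only the
# events from the last N seconds and maintains a running average, the sort
# of thing an EPL "select avg(...) over a 30 sec time window" declares.

class TimeWindowAverage:
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()   # (timestamp, price) pairs, oldest first
        self.total = 0.0

    def on_event(self, timestamp, price):
        """Admit one event, expire stale ones, return the current average."""
        self.events.append((timestamp, price))
        self.total += price
        # Expire events that have aged out of the window
        while self.events and self.events[0][0] < timestamp - self.window:
            _, old_price = self.events.popleft()
            self.total -= old_price
        return self.total / len(self.events)

# Feed a small stream: events arriving at t = 0, 10, and 40 seconds
w = TimeWindowAverage(30)
print(w.on_event(0, 100.0))   # 100.0
print(w.on_event(10, 110.0))  # 105.0
print(w.on_event(40, 130.0))  # t=0 event expired -> (110 + 130) / 2 = 120.0
```

The point of a declarative EPL is that the window bookkeeping above disappears entirely: the developer states the window and aggregate, and the engine handles admission and expiry.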

The company’s premise is quite simple: it provides an open source approach to solving challenges that a growing number of enterprise users are facing, taking into account the points Tibco’s Paul Vincent made in the post quoted above.

As EsperTech puts it: “Classical databases or distributed caches are passive data structures that create frozen and slow assets which require explicit querying to make sense of. By turning to Event Stream Processing (ESP) and Complex Event Processing (CEP) data flow in real-time through queries, one can ensure immediate reactivity and reduced burden of custom development to make sense of what’s going on.”
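The passive-versus-reactive contrast in that quote can be sketched in a few lines. Rather than storing data and polling it with explicit queries, a stream engine holds standing rules and pushes matches to listeners the instant an event arrives. This is an illustrative sketch under assumed names, not Esper’s API.

```python
# Illustrative sketch of the "push" model the quote describes: handlers are
# registered once up front, and every arriving event is tested against them
# immediately -- no stored data set, no explicit query after the fact.

class StreamEngine:
    def __init__(self):
        self.rules = []  # (predicate, listener) pairs

    def register(self, predicate, listener):
        """Declare interest once; the engine calls back on each match."""
        self.rules.append((predicate, listener))

    def on_event(self, event):
        for predicate, listener in self.rules:
            if predicate(event):
                listener(event)

alerts = []
engine = StreamEngine()
# React immediately to any trade over $1M, instead of finding it later
# with a batch query against a database.
engine.register(lambda e: e["amount"] > 1_000_000,
                lambda e: alerts.append(e["id"]))

engine.on_event({"id": "t1", "amount": 500})
engine.on_event({"id": "t2", "amount": 2_000_000})
print(alerts)  # ['t2']
```

The design choice is simply inverted control: the query lives in the engine and the data flows past it, rather than the data living in a store and queries flowing past the data.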

Over the course of the next year, a new crop of complex event processing companies will likely pop up, and we’ll be watching to see how the existing CEP ecosystem adapts to a new base of users demanding solutions that accommodate “big data.”
