Capital Markets Push CEP to the Limit
The capital markets game changes rapidly, with ever-more compute horsepower and frameworks arriving to help firms leverage massive, diverse datasets in near-real time.
From algorithmic trading, liquidity discovery, and real-time risk aggregation to overall market surveillance, there is little doubt that this is the corner of the financial services industry where the early adopters hang.
High-performance hardware aside, the need for robust software platforms to ingest new sources of market and risk data (an increasing amount of it unstructured) and turn it around for rapid decision-making is pushing complex event processing (CEP) development forward, not just to keep up with advances in big iron, but to match big data as well. For many firms, from small hedge funds to large investment banks, the issue is not so much big data as fast, complex data.
With the increasing need to make use of an expanding array of data streams, some analysts suggest that we're on the cusp of an uptick in complex event processing adoption, and not just in the expected areas, which historically have included capital markets.
The Aite Group, which authored the study from which the above graphic was pulled, states that over the last five years CEP has transformed from an emerging technology into an indispensable platform for capital markets customers, and for those in other industries as well. The firm claims that CEP's most consistent growth has occurred in banking, where fraud detection, online banking, and multichannel-marketing use cases reign supreme. CEP has also grown globally and into industries beyond capital markets. Aite Group expects CEP to grow more than 15% annually over the next two years, nearly three times the rate of the average financial services technology budget.
There has been little argument that the big data movement has spurred much of this growth, but it bears repeating that big data is certainly not just about size. It's the volume, velocity and variety of that data that is driving new capital markets CEP deals. Paul Vincent, TIBCO's CTO for Business Rules and Complex Event Processing, says that CEP and big data for capital markets (and beyond) are less about size than the hype reflects, especially when it comes to the real-time needs of many subsets within capital markets.
Vincent claims that these days, “CEP is mostly dealing with normal volumes of data at low to high velocities being tested against normal(ish) volumes of data (maybe up to terabytes, but not petabytes).” He says that when it comes to exploiting big data for capital markets and beyond, the issues of scale and the velocity of the incoming events are just as important as mere volume. Additionally, these elements can’t be considered in a vacuum—the scale, velocity and structure of the existing data is important as well since it needs to be tied into the processing task.
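Vincent's point, that fast-moving incoming events must be evaluated against a body of existing data, is the essence of a CEP rule. As a minimal, vendor-neutral sketch (the symbol, window size and deviation threshold below are illustrative assumptions, not any product's API), here is a sliding-window check that flags ticks deviating sharply from their recent average:

```python
from collections import deque

class SlidingWindowDetector:
    """Illustrative CEP-style rule: flag ticks that deviate sharply
    from the recent moving average for their symbol."""

    def __init__(self, window_size=5, threshold=0.10):
        self.window_size = window_size
        self.threshold = threshold   # 10% deviation from the moving average triggers an alert
        self.windows = {}            # symbol -> deque of recent prices (the "existing data")

    def on_event(self, symbol, price):
        """Process one incoming tick; return an alert string or None."""
        window = self.windows.setdefault(symbol, deque(maxlen=self.window_size))
        alert = None
        if len(window) == window.maxlen:          # only test once the window is full
            avg = sum(window) / len(window)
            if abs(price - avg) / avg > self.threshold:
                alert = f"{symbol}: price {price} deviates sharply from avg {avg:.2f}"
        window.append(price)                      # fold the new event into the window
        return alert

# Feed a small stream of ticks through the detector; only the jump to 130 fires.
detector = SlidingWindowDetector()
stream = [("XYZ", p) for p in (100, 101, 99, 100, 100, 130)]
alerts = [a for a in (detector.on_event(s, p) for s, p in stream) if a]
```

Real CEP engines express rules like this declaratively and run them over many streams in parallel, but the shape of the computation, new events continuously joined against accumulated state, is the same.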
Neil McGovern, who directs product marketing for the capital markets arm at Sybase, claims that the move to CEP platforms provides a number of benefits for this specialized group of users. He says the attractions for capital markets users include the ability to address real-time information and applications without disrupting legacy environments, build and deploy new code quickly, and enhance existing applications to address emerging real-time needs.
In the view of Sybase, which pushes its own CEP platform, Aleri, this is an emerging “paradigm of computing” that is powered by “event driven architecture that offers the ability to analyze extremely large amounts of event and other data from disparate sources with very low latency and high throughput.”
Louis Lovas directs the software solutions arm of OneMarketData, which provides, among other things, a tick platform to suit the CEP needs of capital markets customers. For the capital markets services they provide, the data challenges, as McGovern suggested, go far beyond size. Complexity, storage, access and analysis all prove to be issues for performance-conscious capital markets customers.
He agrees that the challenges on the software side of the capital markets industry include capitalizing on new advances in hardware and managing, storing and making use of the complex time-series data the financial industry relies upon. He notes, however, that these capital markets “big data” problems are nothing new; on the plus side, the hype around the big data “bubble” has drawn more attention to the evolving platform and hardware needs, along with more solutions.
The company’s platform hinges on vast wells of quantitative research data, which allow quants to build trade models for risk mitigation and provide time-sensitive information that leads to profitable trading. These models are not just used by traders; Lovas says that these models, with some variations, guide decisions everywhere from small hedge funds to big investment banks.
More specifically, the data is being used to build mathematical (or econometric) models that factor in several components, including small-scale market changes, movements in the broader economy, and activity in other markets (futures, for example), all within the framework of larger geopolitical contexts.
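A toy linear factor model illustrates the shape of what is described above: an instrument's expected return is modeled as its exposures (betas) to each factor multiplied by that factor's move. Every name and number below is invented for illustration, not drawn from any real dataset or any firm's actual model:

```python
# Hypothetical one-day moves in each factor, as decimal returns.
factor_moves = {
    "local_market": 0.004,     # small-scale market change
    "macro_economy": -0.002,   # movement in the broader economy
    "futures_market": 0.001,   # activity in a related market
}

# Hypothetical exposures (betas) of one instrument to each factor.
exposures = {
    "local_market": 1.2,
    "macro_economy": 0.5,
    "futures_market": 0.8,
}

# Expected return is the exposure-weighted sum of factor moves.
expected_return = sum(exposures[f] * move for f, move in factor_moves.items())
```

Production models add many more factors, nonlinear terms and regime conditioning, which is precisely why they lean on the deep historical time-series stores Lovas describes.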
Lovas says that all of this means that keeping up requires capturing massive amounts of data. In the options market alone, he says, daily peaks are in the range of several million messages per second, a figure that has grown more than 100 percent since 2010, and that's just one asset class. He told us that the amount of data will continue to mount, since most firms trade in multiple asset classes.
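A quick back-of-envelope calculation shows why capture at these rates mounts up fast. The peak rate, message size and duration below are illustrative assumptions only (Lovas gave only the "several million messages per second" figure):

```python
# Back-of-envelope sketch of the capture load described above.
peak_msgs_per_sec = 3_000_000   # "several million messages per second" (assumed value)
bytes_per_msg = 64              # assumed compact binary tick record
seconds_at_peak = 60 * 60       # suppose one hour of peak-rate traffic

bytes_captured = peak_msgs_per_sec * bytes_per_msg * seconds_at_peak
gib_captured = bytes_captured / 2**30   # convert bytes to GiB
```

Under these assumptions a single hour of peak options traffic approaches two-thirds of a terabyte, for one asset class, before any indexing or replication, which is the storage-and-access problem the tick-database vendors are addressing.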
According to StreamBase Systems CEO Mark Palmer, big data challenges like those Lovas describes indicate a need to push the CEP envelope. Palmer, who speaks regularly about event processing for other markets, including Web 2.0 businesses, says traditional custom-coded infrastructures are being outpaced by the speed and volume of the data hitting their organizations. In StreamBase's view, environments that require solutions for algorithmic trading, feed processing and conditioning, cost analysis, and risk management need real-time results, which puts a premium on advancing overall data management capabilities.
CEP platforms and capital markets customers have always gone hand in hand, given the sector's unique processing requirements, but CEP has been making steady inroads into other arenas. The concept of “continuous BI” and the use of diverse streams of unstructured and structured “big data” across multiple, streaming sources will continue to push CEP capabilities.