August 29, 2012

Big Data: A View from Wall Street

Alexander Makeyenkov

Currently there are an estimated 1.2 trillion gigabytes of data floating around cyberspace that Wall Street is trying to harness. It is a shocking figure, and one that is expected to grow by 800% over the next five years.

These astounding numbers are transforming businesses, not least the financial services industry, where technology has historically outpaced the demands coming from the business units. The rise of big data poses challenges for technology as it confronts more data than ever before, along with enormous opportunities.

Financial institutions are looking to develop high-performance, real-time applications with a number of goals in mind, both business and technical. From a business standpoint, they want to illustrate the lifecycle of their products: where orders come from, how they are handled, and ultimately how they reach their destinations.

The technical goals are to ensure that all of these processes are displayed in real time and that the system can handle the volume of transactions the firm engages in, sometimes 100 million per day depending on the size of the firm. The challenge is heightened by very uneven data flow, which mostly peaks around market open and close. Traffic includes advanced algorithmic trading in futures and options, among other instruments, all of which must be processed to the highest security standards.
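As a rough, hypothetical illustration of why peak load rather than daily volume drives capacity planning, the sketch below takes the article's figure of 100 million transactions a day and assumes, purely for the sake of the example, that 20% of that volume lands in a 30-minute burst around the open; the concentration and window sizes are assumptions, not data from any particular firm.

```python
# Hypothetical back-of-envelope estimate: peak vs. average transaction rates.
# The daily volume comes from the article; the peak concentration is an assumption.

SECONDS_PER_TRADING_DAY = 6.5 * 3600        # 9:30-16:00 US equity session
daily_transactions = 100_000_000            # ~100 million per day at a large firm
peak_share = 0.20                           # assume 20% of volume in the busiest window
peak_window_seconds = 30 * 60               # assume a 30-minute burst around the open

average_rate = daily_transactions / SECONDS_PER_TRADING_DAY
peak_rate = (daily_transactions * peak_share) / peak_window_seconds

print(f"average: {average_rate:,.0f} tx/s")  # roughly 4,300 tx/s
print(f"peak:    {peak_rate:,.0f} tx/s")     # roughly 11,100 tx/s under these assumptions
```

Even under these conservative assumptions the burst rate is several times the average, which is exactly the gap that capacity forecasting has to cover.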

Due to strict regulatory policies, this “big data” cannot be stored in cloud systems. The challenge, then, is to build solutions on infrastructure that the institution itself controls, a vital constraint that financial institutions, whether big banks or hedge funds, must cope with. Without clouds, they must determine the best means to upgrade systems and account for growth and advances, all while ensuring that the data is securely backed up. Institutions try to predict the throughput, memory, and CPU they will need, but this cyclical exercise reduces efficiency and dramatically increases redundancy and costs.

In this day and age, all companies invest in technology. In finance, those investments tend to be higher out of necessity, but they still vary across the industry. A company can choose to keep expenses down and invest very little; by doing so, it exposes its systems to frequent mishaps and occasional failures. There are ways to compensate: strict quality-control processes, zero tolerance for issues, rigorous testing, careful documentation, and specification phases.

These preventative approaches might be adequate in the near term, but they neglect the system's ability to sustain volume fluctuations and business growth, which is the whole point. Not to mention that a client's trust is hard to earn back, and a system failure creates lingering skepticism with unforeseeable impacts on the business.

The recession and broader economic turmoil affect entire businesses, and IT is no exception. Cost-cutting measures in the technology space are creating gaps in data redundancy and fault tolerance and increasing vulnerability. Even where financial resources are not the issue, human resources can be: the number of knowledgeable, experienced IT professionals in financial services who are equipped to deal with high-performance, high-load systems falls far short of what the industry requires.

IT professionals in the financial services industry need an intricate knowledge of how markets work, as well as of how communications flow through the system, and each institution's is different. Typically, internal IT personnel have only ever worked within the financial sector. Because of that narrow experience, these professionals are not familiar with the influx of new system challenges, nor do they have the ingenuity to innovate with newer technologies.

Obviously these institutions encounter malfunctions and then look to external support to resolve the issue quickly. Often this is a cost-cutting measure, because it is more economical to outsource in dire circumstances. But the large IT consultancies that field these calls are often plagued by similar in-house inertia. They have neither the agility to address the deeper systemic issues nor the mandate to do so, solving only the immediate problem and leaving the institution vulnerable.

A reassessment of the economic calculations that institutions make is necessary across many facets of their IT. It would lead to better in-house practices, change how assignments are outsourced, and provide better returns in the long run. After all, IT investment can support growth, but it can also stifle it. Cloud solutions, meanwhile, are on the rise, exceeding $21.5 billion in 2010 and predicted to reach $73 billion by 2015.

Although public clouds are currently off-limits for financial institutions, a private cloud changes the approach entirely. Instead of solving the same problems repeatedly, an organization can create a pool of software and hardware to draw on when needed. That simplification reduces costs (both software and hardware), improves quality (by putting the most qualified employees on the critical parts of the system), and hedges against future hardware demands. Complying with Dodd-Frank regulation would also be much easier if all of the data were consolidated in one place.
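As a minimal sketch of the pooling idea, and not a description of any particular vendor's private cloud, the toy `ResourcePool` class below (its name and numbers are invented for illustration) shows the shift from each system owning fixed hardware to capacity being checked out from a shared pool during a burst and returned afterward.

```python
# Illustrative only: a shared capacity pool that systems borrow from and return to.

class ResourcePool:
    def __init__(self, total_cores: int):
        self.total_cores = total_cores
        self.allocated = {}                      # system name -> cores in use

    def acquire(self, system: str, cores: int) -> bool:
        """Grant cores from the shared pool if capacity remains."""
        in_use = sum(self.allocated.values())
        if in_use + cores > self.total_cores:
            return False                         # pool exhausted; queue or expand the pool
        self.allocated[system] = self.allocated.get(system, 0) + cores
        return True

    def release(self, system: str) -> None:
        """Return a system's cores to the pool after its peak passes."""
        self.allocated.pop(system, None)


pool = ResourcePool(total_cores=512)             # one pool sized for the firm's overall peak
pool.acquire("order-flow-analytics", 200)        # market-open burst
pool.acquire("risk-reporting", 100)
pool.release("order-flow-analytics")             # freed capacity serves the next consumer
```

The point of the sketch is the sizing decision: hardware is provisioned once for the pool's aggregate peak rather than separately for every system's individual peak.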

The industry is confronting many problems, but practical solutions already exist, and their application makes this an industry with an enormous amount of opportunity. Digitization is increasingly prevalent (most banks are expected to cut back on physical branches), and the use of technology is growing around the world in every industry. Financial institutions have historically led this trend, and there is every potential for that to continue.

Alexander Makeyenkov is Senior Vice President of Capital Markets at DataArt, a custom software development firm that specializes in building advanced solutions for the financial services industry.
