August 29, 2012

Big Data: A View from Wall Street


There are currently an estimated 1.2 trillion gigabytes of data floating around cyberspace that Wall Street is trying to harness, a staggering figure that is expected to grow by 800 percent over the next five years.

These astounding numbers are transforming the financial services industry. Historically, financial services technology outpaced the demands of the business units it served; the recent rise of big data has reversed that dynamic, confronting technology teams with more data than ever before and presenting enormous challenges alongside equally large opportunities.

Financial institutions are looking to develop high-performance real-time applications with a number of goals in mind, both business and technical. From a business standpoint, they are trying to illustrate the lifecycle of their products: where the orders come from, how they are handled, and ultimately how they reach their destinations.

The technical goals are to display all of these processes in real time and to ensure that the system can handle the volume of transactions the firm engages in, sometimes 100 million per day depending on the size of the firm. The challenge is heightened by very uneven data flow, which typically peaks around market open and close. Order traffic and advanced algorithmic trading span futures, options, and other instruments, all of which must be processed to the highest security standards.
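A rough back-of-envelope sketch shows why those open-and-close peaks, rather than the daily average, drive capacity requirements. The daily volume below is the figure cited above; the 40 percent peak concentration and the peak window are illustrative assumptions, not measurements from any particular firm:

```python
# Back-of-envelope capacity sketch. The daily volume is the figure cited in
# the article; the peak concentration and window are illustrative assumptions.

SECONDS_PER_TRADING_DAY = 6.5 * 3600   # 9:30 to 16:00 US equity session
DAILY_TRANSACTIONS = 100_000_000       # "sometimes 100 million per day"
PEAK_SHARE = 0.40                      # assumed: 40% of volume around open/close
PEAK_WINDOW_SECONDS = 2 * 30 * 60      # assumed: first and last 30 minutes

average_tps = DAILY_TRANSACTIONS / SECONDS_PER_TRADING_DAY
peak_tps = (DAILY_TRANSACTIONS * PEAK_SHARE) / PEAK_WINDOW_SECONDS

print(f"average load: {average_tps:,.0f} transactions/second")   # ~4,300
print(f"peak load:    {peak_tps:,.0f} transactions/second")      # ~11,100
```

Under these assumptions the system must be sized for roughly two and a half times its average throughput, before any safety margin is added.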

Due to strict regulatory policies, this "big data" cannot be stored in cloud systems. The challenge, therefore, is to create solutions backed by suitable in-house infrastructure, something every financial institution, whether a big bank or a hedge fund, must cope with. Without the cloud, firms must determine how best to upgrade systems and account for growth, all while ensuring that the data is securely backed up. Institutions try to predict the throughput, memory, and CPU they will require, but this cyclical guesswork reduces efficiency and dramatically increases redundancy and cost.
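The cost of that guesswork can be sketched just as simply: provisioning in-house hardware for a forecast peak plus a safety margin leaves that hardware largely idle the rest of the time. The figures here are purely illustrative assumptions:

```python
# Illustrative sketch of the over-provisioning cycle described above.
# All figures are assumptions for demonstration, not firm or vendor data.

predicted_peak_tps = 12_000      # forecast peak load, transactions/second
safety_margin = 1.5              # assumed buffer against forecast error
provisioned_tps = predicted_peak_tps * safety_margin

typical_load_tps = 4_300         # assumed sustained load outside the peaks
utilization = typical_load_tps / provisioned_tps

print(f"provisioned capacity: {provisioned_tps:,.0f} transactions/second")
print(f"typical utilization:  {utilization:.0%}")   # roughly 24% under these assumptions
```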

These days, all companies invest in technology. In finance, those investments tend to be higher out of necessity, but they still vary widely across the industry. A company can choose to keep expenses down and invest very little, but doing so exposes its systems to frequent mishaps and occasional failures. There are ways to compensate: strict quality control processes, zero tolerance for issues, rigorous testing, careful documentation, and thorough specification phases.

These preventative approaches might suffice in the near term, but they do nothing to help the business sustain volume fluctuations and growth, which is the whole point. Moreover, a client's trust, once lost, is hard to earn back, and a system failure creates a lingering skepticism with unforeseeable impacts on the business.

The recession and economic turmoil affect entire businesses, and IT is no exception. Cost-cutting measures in the technology space are creating more problems with data redundancy, fault tolerance, and vulnerability. Even where financial resources are not the issue, human resources may be: the number of knowledgeable, experienced IT professionals in financial services who are equipped to deal with high-performance, high-load systems falls well short of what is required.

IT professionals in financial services need an intimate knowledge of how markets work and of how information flows through the system, and each institution has its own way of doing things. Internal IT personnel have typically worked only within the financial sector. Because of such narrow experience, these professionals are often unfamiliar with the influx of new system challenges, nor do they have the ingenuity to innovate with newer technologies.

Inevitably, these institutions encounter malfunctions and turn to external support to resolve the issue quickly. This is often a cost-cutting measure, since it is more economical to outsource in dire circumstances. But the large IT consultancies that field these calls are often plagued by similar in-house inertia. They have neither the agility nor the mandate to address the deeper systemic issues, so they solve only the immediate problem and leave the institution vulnerable.

A reassessment of the economic calculations institutions make across many facets of their IT is necessary. It would lead to better in-house practices, change how assignments are outsourced, and provide better returns in the long run. After all, IT investment can support growth, but it can also stifle it. Meanwhile, cloud spending is on the rise, exceeding $21.5 billion in 2010 and predicted to reach $73 billion by 2015.

Although public clouds are currently off-limits for financial institutions, a private cloud system changes the approach entirely. Instead of solving the same problems repeatedly, an organization could create a pool of software and hardware for use when needed. The simplification would reduce costs (both software and hardware), improve quality (by having the most qualified employees working on critical parts of the system), and hedge against future hardware demands. Complying with Dodd-Frank regulation would also be much easier if all of the data were consolidated in one place.
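A minimal sketch shows why pooling helps: individual systems rarely peak at the same moment, so a shared pool can be sized well below the sum of the individual peaks. The workload names, peak loads, and coincidence factor below are hypothetical assumptions, not figures from the article:

```python
# Minimal sketch of pooled vs. dedicated provisioning. Workload names, peak
# loads, and the coincidence factor are hypothetical assumptions.

peak_loads_tps = {
    "order_routing": 12_000,
    "risk_analytics": 6_000,
    "reporting": 3_000,
}
coincidence_factor = 0.7   # assumed: combined peak is ~70% of the sum of peaks

dedicated_capacity = sum(peak_loads_tps.values())           # each system sized alone
pooled_capacity = coincidence_factor * dedicated_capacity   # one shared private-cloud pool

print(f"dedicated provisioning: {dedicated_capacity:,} transactions/second")   # 21,000
print(f"shared pool:            {pooled_capacity:,.0f} transactions/second")   # 14,700
```

The gap between the two figures is hardware that never needs to be bought, and it widens as more workloads with uncorrelated peaks share the pool.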

The industry is confronting many problems, but practical solutions already exist, and applying them makes this an industry with enormous opportunity. Digitization is increasingly prevalent, with most banks expected to cut back on physical branches, and the use of technology is growing around the world in every industry. Financial institutions have historically led this trend, and there is every potential for them to continue doing so.

Alexander Mkeyenkov is Senior Vice President of Capital Markets at DataArt, a custom software development firm that specializes in building advanced solutions for the financial services industry.



Discussion

There is 1 discussion item posted.

Private vs. public clouds?
Submitted by vas on Aug 31, 2012 @ 2:02 PM EDT


Alexander,

You say: "Due to strict regulatory policies, this “big data” cannot be stored in cloud-systems."

Do you mean "...cannot be stored in _public_ cloud-systems."? Surely a private cloud that complies with the appropriate regulations can deliver many of the same benefits that companies in other sectors are realizing through their use of public clouds (albeit at higher TCO, perhaps).

Post #1
