January 28, 2013

CTO Sees Virtualized Big Data as Next Challenge


It’s no secret that the big data phenomenon is reshaping overall approaches to enterprise IT, but it’s just one part of a grander restructuring that has been led by other trends, including wider-scale adoption of public cloud resources and virtualization.

Not long ago, virtualization buff and Silver Peak CTO David Hughes argued that the convergence of trends like virtualization and the increasingly complex data volumes that need to be moved quickly marks another shift for IT leaders.

As Hughes told Datanami, the biggest changes have to do with the networking department and moving big data over distance. "With virtualization, we will find more storage and application owners wanting to solve the data mobility challenge without involving the networking department. Things like software-defined acceleration are making it possible for storage and server administrators to move and accelerate their workloads with point-and-click simplicity, all from the virtualization management console."

We discussed these issues in greater detail with the Silver Peak CTO in the interview that follows.

Datanami: When it comes to conversations about "big data," what is it about software-defined acceleration that gets overlooked?

Hughes: When it comes to “big data” and software-defined acceleration, it’s less about what’s being overlooked and more about the fact that it hasn’t been done yet. As data volumes increase and organizations begin pulling in data from multiple sources across a wide area network (WAN), they must contend with the adverse effects of distance, network quality and capacity, all of which can slow down the transfer and accessibility of that data. Software-defined acceleration provides a simpler, more accessible model for moving larger volumes of data more quickly over longer distances.
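
To make the distance effect concrete, here is a minimal back-of-envelope sketch, in Python, of how round-trip latency alone caps the throughput of a single TCP stream: the sender can only keep one window of data in flight per round trip, so longer distances mean lower effective transfer rates regardless of raw link capacity. All numbers below are illustrative assumptions, not Silver Peak figures.

```python
# Illustrative sketch: why latency (distance) caps a single TCP stream's
# throughput regardless of raw link capacity. Numbers are assumptions
# chosen for illustration, not vendor figures.

def max_tcp_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound for one stream: one window of data per round trip."""
    return (window_bytes * 8) / (rtt_ms / 1000.0) / 1e6

WINDOW = 64 * 1024  # classic 64 KB TCP receive window

for label, rtt in [("same metro", 2), ("cross-country", 70), ("transpacific", 150)]:
    print(f"{label:>13} (RTT {rtt:3d} ms): "
          f"{max_tcp_throughput_mbps(WINDOW, rtt):7.1f} Mbps max")
```

Even on a 10 Gbps link, a cross-country stream with a 64 KB window tops out well under 10 Mbps in this model, which is the kind of gap that WAN optimization techniques (larger windows, parallel streams, deduplication) aim to close.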

As well as having to move vast amounts of big data around, it is also crucial that this data be protected and kept secure, both for regulatory and compliance reasons and simply to maintain customer trust. Recent natural disasters, including Hurricane Sandy and the Japanese tsunami with the resulting Fukushima Daiichi nuclear disaster, show that it is no longer sufficient to replicate data across town or even within the same state. You can no longer simply replicate from New York to New Jersey; you need to replicate data over greater distances.
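
A rough sketch of why distance changes the replication picture: with synchronous replication, every acknowledged write waits at least one network round trip, and light in fiber covers only about 200,000 km per second. The routes and distances below are illustrative assumptions.

```python
# Rough sketch: why synchronous replication degrades with distance.
# Each acknowledged write waits at least one round trip, and light in
# fiber travels roughly 200,000 km/s. Routes/distances are illustrative.

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time over fiber, ignoring routing and queuing."""
    return 2.0 * distance_km / 200_000 * 1000

for route, km in [("New York -> New Jersey", 50),
                  ("New York -> Chicago", 1150),
                  ("New York -> California", 4700)]:
    rtt = min_rtt_ms(km)
    print(f"{route:<23} ~{rtt:5.1f} ms RTT -> "
          f"at most {1000 / rtt:6.0f} sequential sync writes/s")
```

This is why replicating over continental distances generally pushes organizations toward asynchronous or accelerated transfer rather than naive synchronous writes.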

IDC estimates that 35 percent of the information in the digital universe needs protection, but that only 19 percent actually is protected. A recent study conducted by Forrester Consulting on behalf of Silver Peak also found that a large majority, 72 percent, agree or strongly agree that they would like to replicate more of their application data than they currently do, and 62 percent would like to replicate more frequently. Big data is more than a storage and server challenge; it is a challenge for the network, and software-defined acceleration stands to play a critical role in big data in the future.

Datanami: What seem to be the hottest industries for what you and your competitors provide, and what is it about their workloads that makes them such a good fit?

Hughes: The market for data acceleration over distance is fairly horizontal. As data volumes increase, disaster recovery requirements become more ubiquitous and applications move to the cloud, IT professionals across a variety of vertical markets need to move more data quickly over longer distances. Silver Peak data acceleration software overcomes the distance, quality and capacity challenges that are inherent in today’s wide area networks.

Industries where we do see larger volumes of data and the need for higher-performance, higher-capacity technologies are high-tech and oil and gas. With high-tech in particular, if you look at the Googles, Amazons and Facebooks of the world, they are dealing with enormous amounts of data being transmitted on a global scale. But it’s about more than just providing accessibility to that data; it’s also about protecting it to ensure the availability and sustainability of their business. These requirements place a huge dependency on the wide area network.

Datanami: What are the next-generation data movement and management challenges we'll face in 2013?

Hughes: The next generation of data movement and management challenges will be focused on data replication and the movement of data over greater distances. As data volumes grow and data replication requirements increase, more strain is being placed on wide area networking infrastructure.

A lot of people assume that because bandwidth is getting cheaper and bandwidth rates are going up, the WAN bandwidth bottleneck is going away or will go away. What’s interesting is that the growth of data continues to exceed the rate at which new services, new technologies and bandwidth upgrades are being deployed within carrier networks. The increase in traffic, whether it be the amount of storage data, the uptake of replication data, or the overall growth of traffic on the Internet, far outpaces the level of innovation and the price drops you’re seeing in enterprise WAN services. This translates into there being a worse WAN bottleneck today than there was 10 years ago.
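
Hughes's compounding argument can be made concrete with a toy calculation. Assuming, purely for illustration, that data volumes grow 40 percent a year while deployed WAN capacity grows 20 percent, the gap widens every year:

```python
# Toy model of the widening WAN bottleneck Hughes describes: if data
# grows faster than deployed bandwidth, the gap compounds annually.
# Both growth rates below are illustrative assumptions.

data_growth = 1.40       # assumed 40% annual growth in data volume
bandwidth_growth = 1.20  # assumed 20% annual growth in WAN capacity

ratio = 1.0  # data-to-bandwidth ratio, normalized to 1.0 today
for year in range(1, 11):
    ratio *= data_growth / bandwidth_growth
    if year in (1, 5, 10):
        print(f"year {year:2d}: data outstrips bandwidth by {ratio:.2f}x")
```

Under these assumed rates the imbalance roughly quadruples within a decade, which is consistent with Hughes's claim that the bottleneck is worse now than it was 10 years ago.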
