November 08, 2011

IBM Investing Billions in the Big Data Frontier


If you try to make a quick mental list of companies that have weathered the drastic societal, technical and economic changes of the last century, chances are, you would be able to tally them on one hand.

Of any you were able to conjure in a moment’s time, chances are also quite good that none outside of IBM had their roots in computing--and even those that latched onto computation-driven technology 50 or 60 years ago are few and far between.

According to John Kelly III, IBM’s Senior Vice President and Director of Research, this is because of his company’s emphasis on ongoing, aggressive research and development programs.

During Big Blue’s centennial “Frontiers in IT” colloquium, Kelly said his company is pushing another $6 billion into innovation, saying IBM will survive, prosper or fail based on its ability to meet bleeding-edge goals to power or create revolutionary technologies.

Kelly pointed to four distinct areas of technology that encapsulate IBM’s current and future research and development focus. Among these are big data and the associated deep analytics, learning or cognitive systems, the era of exascale, and nanotechnology.

While we will get to the last two on the list in a moment, it is worth noting that Kelly continually pointed to Watson as a shining example of what is possible in terms of deep analytics and, at the system level, the efficiency and performance required to make those types of analytical capabilities a more pervasive reality.  

The Watson connection is simple to make on the deep analytics and cognitive system level, but there are connections to the other two areas in terms of balancing performance and efficiency. For instance, Kelly says that we are rapidly migrating away from hard, leaky silicon semiconductors to a carbon-based future that marries performance and power-awareness. These technologies will in turn power exascale machines, which in turn will fuel the kinds of powerful parallel programs that will further feed into new innovations for analytics at unprecedented scale and level of detail.

In terms of data, Kelly pointed to the unprecedented growth of data sizes and stated that the real challenge isn’t the size, it’s the fact that this data is coming at us faster than before and there are demands to respond to it many times faster than we can now. He said that in order to intercept big, fast data, we need to be able to make the move from hours or minutes to milliseconds or microseconds.
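To put that target in perspective, the jump Kelly describes spans many orders of magnitude. A quick back-of-the-envelope calculation (ours, not IBM’s):

```python
# Back-of-the-envelope arithmetic on the latency gap Kelly describes:
# moving a response time from one hour down to one microsecond is a
# roughly 3.6-billion-fold speedup -- about nine and a half orders of
# magnitude.
import math

hour_in_us = 60 * 60 * 1_000_000   # microseconds in one hour
speedup = hour_in_us / 1           # target response time: one microsecond
orders_of_magnitude = math.log10(speedup)

print(f"speedup: {speedup:,.0f}x ({orders_of_magnitude:.1f} orders of magnitude)")
```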

Using Watson as an example, Kelly said that Watson needed to be able to hear the question as spoken naturally, process the meaning of that data to generate an answer, and decide whether to answer or stay mum. This all happened in 2.5 seconds. The key here is that in order to do this, Watson had to pull exclusively from local memory, and that going forward, this is how architectures will have to evolve to meet big data demands.

Further complicating the big data problem is the fact that the data coming in from an ever-growing array of devices and sensors is noisy and dirty. He said the key innovation area will revolve around finding ways to quickly clean and analyze the data, and that these efforts will comprise an entirely new dimension in the big data problem.
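As a minimal illustration of the kind of inline cleaning Kelly alludes to (this is our sketch, not IBM’s method -- all names and thresholds are hypothetical), a fast pass over a sensor stream might reject out-of-range readings and smooth the rest before any analysis runs:

```python
# Hypothetical cleaning pass over noisy sensor readings: drop values
# outside a plausible range, then apply a simple sliding median filter
# to damp remaining noise. Thresholds and window size are illustrative.
from statistics import median

def clean_readings(readings, lo=0.0, hi=100.0, window=3):
    """Drop readings outside [lo, hi], then median-filter the survivors."""
    valid = [r for r in readings if lo <= r <= hi]
    smoothed = []
    for i in range(len(valid)):
        w = valid[max(0, i - window // 2): i + window // 2 + 1]
        smoothed.append(median(w))
    return smoothed

# 999.0 and -5.0 stand in for sensor glitches
noisy = [21.3, 22.1, 999.0, 21.8, -5.0, 22.4]
print(clean_readings(noisy))
```

In a real streaming setting this logic would run incrementally per reading rather than over a full list, but the shape of the work -- validate, then smooth, then analyze -- is the same.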

Growing outward, deep analytical systems that mimic biological systems are an imminent focus for IBM. Kelly pointed to the desire to move from the era of programmable machines to the dawn of the age of cognitive ones. Beyond the massive parallel programming required to reach the cognitive level of computing, this also involves making such systems incredibly efficient, which links back to the carbon-based innovations that will replace silicon, discussed in his tangent about nanotechnology.

Speaking of the nanotech angle, Kelly pointed to some specific examples of how IBM is reshaping nanotechnology, pulling in examples of research and development projects like the DNA transistor, which allows for on-the-spot personalized medicine. This “transistor” device, which the company is developing with the drug company Roche, sends DNA strands directly into a “nanopore” so an electric sensor can read the genetic information. Another example at IBM is the ongoing development of anti-bacterial nanoparticles that can detect and destroy antibiotic-resistant bacteria (and are even biodegradable).

It’s hard not to sit back and admire any company that has been able to negotiate dramatic changes in technology and the way people interact with new innovations. If Kelly is correct and the secret sauce is pure R&D, IBM will be one of the few technically geared American companies to weather past the 150-year mark.
