March 7, 2013

Intel: Real Time is Where Hadoop Will Shine

Isaac Lopez

Hadoop has hardly scratched the surface of its potential, says Boyd Davis, vice president of Intel's Architecture Group, who believes that once users of the framework ramp up to real-time workloads, they'll appreciate the optimizations Intel brings with its distribution of the framework.

“I think one of the myths about the Hadoop framework is that it isn’t demanding of the underlying infrastructure,” said Davis in a video released last week. Davis attributes this myth to organizations not yet using Hadoop to its full capability. “People are doing only batch processing of relatively simple data types, not trying to get that data more in real time,” argues Davis.

Davis says that optimizations that Intel is bringing to the table with their partners will head these challenges off at the pass by providing a hardware framework that is optimized for real-time, multi-application utilization of the Hadoop software framework.

“Intel has a shared vision about making the Hadoop framework a foundation layer for multiple applications, where customers can store their data once and do SQL processing, NoSQL processing, or MapReduce processing, and really take the same data set and get value out of it in multiple different ways,” said Davis. “I think that it’s a bit of a unique vision for us to establish [Hadoop] as a horizontal ingredient in a broader range of solutions.”

Intel, of course, announced its own distribution of Hadoop, optimized for its Xeon processor line, last week, saying that it is using its considerable resources to bring both hardware and software optimization to the framework. Partners include Cisco, SAP, Savvis, and Red Hat, among others.

Agreeing with Davis’s assessment was VP and CTO of Cisco’s Data Center Group, Paul Perez, who sees Hadoop as an indispensable part of the enterprise future. Hadoop, says Perez, “is definitely the new substrate for data intensive computing in enterprise, and also for service providers. We see optimization not only for Hadoop implementations in private clouds and enterprise, but also in optimizations that enable multi-tenancy so that Hadoop can be offered as a service by service providers as well.”

Perez explains that Cisco’s role in the partnership includes extending the built-in automation of its UCS platform while leveraging Intel’s optimizations to help create an environment for Hadoop that is easy to plan, provision, and scale.
