High performance computing (HPC), cloud computing, and open source are the technologies that have ushered in the big data era, said Girish Juneja, Intel CTO and GM of Big Data, this week at Strata 2013.
[Photo: Girish Juneja, Intel CTO and GM of Big Data]
“Over the course of the last 10 years, we have worked in HPC environments; we have worked in large computation data sets, and optimized those technologies to deliver the performance that is needed.”
After acknowledging the high-performance hardware infrastructure piece, Juneja turned to how Intel intends to move the needle through continued and increased support of the open source community, commenting that open source has evolved from being a good-enough substitute for proprietary software to being the leading driver of innovation in computing.
“Intel has worked in open source for the last dozen or so years,” explained Juneja. “We are, for those that may not know, one of the largest contributors to the Linux kernel. We have been working in Apache for a long time, and in the cloud environment, OpenStack has been one of the major initiatives from Intel.”
“It should come as no surprise to this audience that we are converging these three trends together, and actually providing an Intel distribution for Apache Hadoop software to the community,” said Juneja, commenting on Intel's recent announcement.
Juneja explained that Intel's involvement with Hadoop stems primarily from customer use cases in which gaps in performance, security, and management were the chief issues with deployments. To address these areas, Intel has launched three open source projects:
- Project Rhino (security and compliance)
- Project Panthera (analytics with SQL and Hadoop)
- Graph Builder (graph construction using Hadoop)
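To give a feel for the kind of task the last project addresses, here is a purely illustrative, single-machine sketch of graph construction from raw edge records. This is not Intel's Graph Builder API (which runs the job at Hadoop scale); the function name and data shapes are assumptions for the example.

```python
# Toy sketch of graph construction: turning raw (src, dst) edge records
# into an adjacency structure. Graph Builder does this kind of work on
# Hadoop; this is NOT Intel's API, just an illustration of the task.
from collections import defaultdict

def build_adjacency(edges):
    """Build an undirected adjacency list from (src, dst) pairs."""
    graph = defaultdict(set)
    for src, dst in edges:
        graph[src].add(dst)
        graph[dst].add(src)
    # Sort neighbor lists for deterministic output.
    return {node: sorted(nbrs) for node, nbrs in graph.items()}

edges = [("a", "b"), ("b", "c"), ("a", "c")]
print(build_adjacency(edges))  # {'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b']}
```

At Hadoop scale the same shape of computation would be expressed as a MapReduce job over edge files rather than an in-memory loop.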
Before handing the stage over to partners at SAP, Juneja gave a hat tip to the partner ecosystem Intel is working with, and then revealed that Intel has recently become a platinum sponsor of the Apache Software Foundation.
“We can all together strengthen this common horizontal layer, make it the substrate on which innovation can happen, not just now, but several years down the road as the datacenter evolves to a much more scalable infrastructure,” said Juneja.