August 24, 2012

This Week's Big Data Big Five

Somehow September is already almost upon us, which most assuredly marks the beginning of a ramped-up news cycle. This week we bring you snippets from five announcements, including news of another startup that scored millions for analytics, some big data acceleration partnerships and inventions, a Hadoop-booster story, and news from a company named this year’s analytics trendsetter.

Let’s kick off this week’s top five with news of another big data startup funding round…

Analytics Startup Strikes $20M Chord

Fidelity Growth Partners India pumped $20 million into AbsolutData, an Indian company focused on big data analytics services for global organizations. With this round of investment, AbsolutData says it expects to scale up its global delivery footprint to meet growing demand for advanced and big data analytics.

AbsolutData has already staked a claim to a number of Fortune 1000 organizations across the globe. Following the investment news, officials from the company said, “Analytics is a top priority for CXOs across the world. However, there is a shortage of skilled data scientists to analyze the growing amounts of data. AbsolutData, with its deep industry expertise and strong analytical capabilities, is well positioned to help its clients gain maximum value from their data assets.”

“We now serve some of the largest and most reputed retail, consumer goods, technology and hospitality companies, across US, Europe and Asia Pacific. This investment will enable us to further strengthen our ability to service the increasing demand from our global clients,” said Dr. Anil Kaul, CEO, AbsolutData.

As part of this transaction, Kabir Narang, Director at Fidelity Growth Partners India, joins the AbsolutData Board of Directors. Commenting on the investment, Kabir Narang said, “We had prioritized business analytics as an investment theme. This sector will continue to benefit from the dramatic increase in volume of data generated by consumer devices such as smart phones and tablets and the increase in computing power and storage capacity in organizations.”

Actuate and VoltDB Team to Accelerate Big Data

Actuate, “the BIRT Company,” and VoltDB, which focuses on ultra-high-throughput relational database management systems (RDBMSs), announced an alliance that they claim will enable VoltDB and ActuateOne customers to more quickly and effectively process their large datasets.

VoltDB is a NewSQL database system designed for organizations that have reached the price/performance limitations of traditional SQL databases. The company contends it can offer processing at speeds from 45 to 100 times faster than traditional databases, and it emphasizes high-speed data applications. The claim is that VoltDB can augment big data storage by “helping to make inexpensive virtualized computing infrastructures – including cloud service platforms – work better for companies handling high velocity transactional data feeds, including log/event data, market data feeds, and social media feeds.”

ActuateOne is a suite of interactive applications and a development and deployment platform built on BIRT. The company says the operational dashboard and analytics features of ActuateOne are especially appealing to customers using VoltDB because they combine real-time data feeds with historical data, allowing clients to see aggregated trends, track data and spot emerging themes as they erupt in real time.

“Together, VoltDB offers lightning-speed transaction processing and ActuateOne enables real-time analysis and visualization without the need for pre-processing. The combination can dramatically speed the time from Big Data access to operationalized insights that can improve the bottom line in any organization,” said Sam Berg, VP of Field Operations for VoltDB.

The companies say that combining the two products provides low latency, guaranteed accuracy, high write throughput and the ability to scale reads and writes of large active datasets, all while maintaining a relational model and SQL semantics. They also suggest that the operational and analytic dashboards let collaborators use native analytic capabilities to refine their data visualizations, providing visibility through the transactional, operational and analytical phases of their data. Further, both claim the ability to combine real-time and historical data sources to provide the context organizations need to analyze ever-evolving information.
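As a loose illustration of the pattern described above (this is not VoltDB or ActuateOne code; the event types and counts are invented), a dashboard-style aggregation that blends precomputed historical totals with events arriving in real time could be sketched like this:

```python
from collections import Counter

# Hypothetical sketch: blend precomputed historical aggregates with a
# batch of live events, so emerging trends show against the baseline.
# Event names and counts are illustrative only.
historical = Counter({"login": 1200, "purchase": 300})

def apply_live_events(totals, events):
    """Fold a batch of live events into a copy of the running totals."""
    updated = totals.copy()
    updated.update(event["type"] for event in events)
    return updated

live_batch = [{"type": "purchase"}, {"type": "login"}, {"type": "purchase"}]
combined = apply_live_events(historical, live_batch)
print(combined["purchase"])  # 302
print(combined["login"])     # 1201
```

The point of the sketch is only the shape of the computation: historical context is kept cheap to query while live data folds in incrementally, which is the effect the two vendors attribute to their combined stack.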

Progress Offers Sneak Peek at Hadoop Driver

Progress Software Corporation announced this week that it will offer a limited-availability preview program for the latest addition to its premium data connectivity driver portfolio. The new, high-performance DataDirect Connect XE for ODBC driver for Hadoop Hive will enable reliable, secure and highly scalable connectivity to multiple distributions of Hadoop.

The company says that organizations are using the Apache Hive data warehouse for data summarization, query and analysis. However, data comes in many forms, and organizations need the ability to perform additional analysis in real time using their existing SQL-based business intelligence and data analytics tools. This means they need a highly flexible and rock-solid connection to those tools. Without such connectivity, companies’ analysts and decision makers are locked out of the insights contained in the Hadoop-based data warehouse.

The new DataDirect Connect for ODBC driver for Hadoop will deliver the same breadth, quality and efficiency that more than 300 OEM partners and hundreds of thousands of users of the DataDirect data connectivity portfolio already enjoy today. Progress claims that for the corporate user, the driver will unlock reams of business intelligence and give business analysts the information they need to make accurate decisions quickly and easily. For ISVs and OEMs, it means a single driver, with DataDirect quality, performance and advanced features, that enables their applications to connect to most major distributions of Hadoop, including Apache Hadoop, MapR’s distribution and Cloudera’s distribution of Apache Hadoop.
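To give a flavor of the SQL-based summarization such a driver exposes to existing BI tools, here is a self-contained sketch using Python’s built-in SQLite as a stand-in for Hive (the table and rows are invented; a real deployment would push the same kind of GROUP BY query to Hive over the ODBC connection):

```python
import sqlite3

# Stand-in sketch: the sort of summarization query a BI tool would send
# to a Hive warehouse through an ODBC driver, run here against an
# in-memory SQLite database so the example is runnable. Table name,
# columns and rows are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE web_events (event_type TEXT, user_id INTEGER)")
conn.executemany(
    "INSERT INTO web_events VALUES (?, ?)",
    [("click", 1), ("click", 2), ("view", 1), ("view", 3), ("view", 2)],
)

# Count events by type -- standard SQL that an analyst's existing
# tooling already knows how to generate.
rows = conn.execute(
    "SELECT event_type, COUNT(*) AS n FROM web_events "
    "GROUP BY event_type ORDER BY event_type"
).fetchall()
print(rows)  # [('click', 2), ('view', 3)]
```

The value proposition Progress describes is exactly this: analysts keep writing ordinary SQL while the driver handles the translation to whichever Hadoop distribution sits underneath.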

The DataDirect Connect for ODBC driver for Hadoop is expected to ship at the end of October 2012, with preview access available to a limited number of participants beginning in September. More information on getting a jumpstart with the driver is available from Progress Software.

Bringing SAN-Free Virtualization to Hadoop

Nutanix, a provider of SAN-free datacenter solutions for VMware virtualized workloads, this week announced a turnkey platform that employs VMware’s core virtualization technology and Fusion-io flash for cost-effective virtualized Hadoop deployments that demand high I/O performance and rapid capacity expansion without the need for a SAN or NAS. The news follows the company’s Series C funding announcement of August 21, 2012.

Nutanix says its Complete Cluster collapses the conventional two-tiered datacenter design -- compute and storage -- into a single scalable tier of infrastructure. Inside each appliance, Nutanix software runs on top of VMware vSphere, transforming local storage devices into a distributed virtualized cluster.

The company says the software-based approach to storage provided by Nutanix Complete Cluster is ideal for running virtualized scale-out workloads such as Hadoop. The claim is that “Its innovative use of server-attached Fusion-io flash memory and 10GbE network speeds, delivers ultra low-latency and performance at any scale.”

Clusters start at four nodes but can dynamically expand to several hundred. This combination of commodity hardware, flash and software-based scale-out storage offers a compelling solution for deployments that demand high-end Hadoop performance optimized for virtualized datacenters.

"Nutanix Complete Cluster software-based storage capabilities enable a flexible deployment model ideal for running Hadoop workloads on VMware vSphere virtualized workloads," said Parag Patel, vice president, Global Strategic Alliances, VMware.

"In order for Hadoop to thrive in the private cloud, it must borrow from the scale-out converged architectures prevalent in the public cloud -- virtualized, elastic, and with multi-tiered storage. To succeed, vendors must address this problem for the mass market with a software-defined approach delivered on off-the-shelf hardware," said Merv Adrian, a Vice President at Gartner.

"In virtualized Hadoop environments, low-latency access to data is the key to successful deployment, and the Nutanix Complete Cluster successfully leverages the power of VMware and Fusion ioMemory to realize highly scalable compute performance, while also offering significant value through savings on capital and operational expenditures associated with the rapid scale-out of traditional storage arrays," said Tyler Smith, Vice President of Alliances, Fusion-io.

SAS Pegged as Predictive Analytics, Data Mining Trendsetters

This year, SAS predictive analytics and data mining earned the KMWorld 2012 Trend-Setting Product of the Year Award.

SAS recently expanded its family of high-performance analytics products, which the company says helps customers create new value from big data -- including text -- to make better decisions within tight time frames. Among its users is job search giant Monster.

“Using SAS Analytics, Monster Worldwide, parent company of Monster – the premier global online employment solution for people seeking jobs and the employers who need great people – creates and deploys predictive models that boost job postings performance,” said Jean-Paul Isson, Global Vice President of BI and Predictive Analytics, Monster Worldwide. He said, “Leveraging SAS we marry external and internal data to deliver customer intelligence that makes our clients’ recruitment more effective. We saw sharp improvements. With global scoring and optimization models, our average order size increased 24 percent and retention jumped 17 percent. Innovation through analytics fuels those successes.”

SAS says the key components behind its push for high-performance analytics include:

  • Exploratory Data Analysis with dynamic visualization; advanced statistical techniques and core data mining capabilities to quickly identify relationships and opportunities.
  • Text mining to reveal new insights from documents to enhance predictions.
  • Model development and deployment to create highly accurate descriptive and predictive analytic models based on large volumes of data.
  • Analytical acceleration for faster results and improved data governance with in-database analytics.
  • Scoring acceleration to maximize the performance and accuracy of analytic models.
