Salesforce Analytics Cloud: Now with More Hadoop
Salesforce customers who use the vendor’s hosted Wave analytics platform now have an easier way to access data from their on-premises Hadoop clusters and cloud-based big data repositories. Today’s launch of Salesforce Wave for Big Data brings support for systems from Cloudera, Hortonworks, Google, and New Relic, among others.
In October, Salesforce launched its Wave Analytics Cloud, a hosted analytics platform designed to give non-technical users in sales, marketing, and service positions access to information that improves sales and boosts customer satisfaction and retention. The NoSQL-based system garnered positive reviews, especially for its self-service approach and the schema-less way it ingests data.
Despite the years of development effort that Salesforce put into Wave, there was no way it was going to cover all the analytic and BI bases. (After all, no first-gen product ever satisfies all requirements.) One of the gaps in Wave was the way users brought in data from external data repositories, specifically Hadoop and similar offerings like Google’s Bigtable. While Wave went to market with the “self-service” tag, uploading data from Hadoop into Wave typically required complex ETL processes managed by specialized analysts and programmers in IT.
Salesforce addressed this functionality gap yesterday with Salesforce Wave for Big Data. While the on-prem Hadoop data still needs to be uploaded into the Salesforce cloud to be accessible by Wave users (and the data may or may not need some cleaning and transforming before it’s useful or usable), the vendor and its partners have automated some of the work required to extract and load that data from Hadoop into Wave.
Salesforce says it delivered “native” integration by way of a Java program that installs on the Hadoop cluster and pushes data up to the cloud. Keith Bigelow, the senior vice president and general manager of the Salesforce Analytics Cloud, said it’s about connecting “the last mile.”
“[I]f you look at the state of the market today, while we have these incredible big data platforms that are used by IT, by developers, by data scientists, delivering that last mile to the sales person, to the service person, to the marketer…that’s been our real challenge,” he said during a conference call yesterday. “What if everyone could gain that value from big data for every interaction they have with their customers? That’s precisely what we’re announcing today.”
The Java program that Salesforce developed with its partners is ready to help customers tap into the data they have sitting in Hadoop clusters from Cloudera and Hortonworks, Bigelow said. Two unnamed Wave customers are already in production with this connector, he said. Salesforce is still working on an “expanded product integration” with Google, although there is an integration method available now.
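Salesforce has not published the connector’s interface, but conceptually an on-cluster agent extracts records and pushes them to the hosted platform in batches. A minimal, purely hypothetical sketch of that batching step (the class and method names here are illustrative, not part of any Salesforce API):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of an on-cluster upload agent's batching step.
// A real connector would read from HDFS and POST each batch to a
// cloud endpoint; here we only show how extracted records might be
// grouped into fixed-size chunks before upload.
public class WavePushSketch {

    // Split extracted records into batches of at most batchSize rows.
    static List<List<String>> batch(List<String> records, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += batchSize) {
            int end = Math.min(i + batchSize, records.size());
            // Copy the slice so each batch is independent of the source list.
            batches.add(new ArrayList<>(records.subList(i, end)));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<String> records = new ArrayList<>();
        for (int i = 0; i < 10; i++) records.add("row-" + i);

        List<List<String>> batches = batch(records, 4);
        // 10 records in batches of 4 -> 3 batches (sizes 4, 4, 2)
        System.out.println(batches.size());        // prints 3
        System.out.println(batches.get(2).size()); // prints 2
    }
}
```

In practice each batch would then be serialized and pushed over HTTPS; batching keeps individual uploads small and restartable, which matters when moving large Hadoop datasets over a WAN link.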
The $5-billion CRM giant is also working with data transformation software providers Informatica and Trifacta and hosted analytics software provider New Relic to make external data available to Wave users. Instead of cleaning big data by hand, which is extremely time-consuming and tedious, Wave customers can tap into the pre-built data transformation libraries that Informatica and Trifacta have built, enabling data to be cleansed natively on Hadoop before being uploaded to the Wave cloud for analysis.
Trifacta’s CEO Adam Wilson sees the work his company is doing with Salesforce Wave as an indication of the democratization of big data analytics. “This closely maps to a growing trend we’re seeing in the big data space where the initial group of technical users is expanding to include a new class of data-driven business professionals,” Wilson says. “When diverse data from Hadoop is brought to the Salesforce Analytics Cloud, the insights available to these users grow exponentially.”
In a demo, the senior director of product marketing for Salesforce Wave Analytics Cloud, Anna Rosenman, showed how Trifacta’s software can be used to clean and prep raw data for analysis on Wave. “Bringing big data into the hands of BI users is not something that’s easy to do,” Rosenman told Datanami. “We’re trying to make it formal and easier and better and faster.”