April 2, 2014

Pivotal: Say No to Hadoop ‘Tax’ with Per-Core Pricing

Alex Woodie

Pivotal today unveiled its Big Data Suite, a collection of its big data software products that includes the Greenplum database and Apache Hadoop. While the stack doesn’t introduce any new products or technologies, it does bring a vastly simplified licensing model, including per-core pricing for all the point products in the suite, which the company says will allow customers to store an unlimited amount of data in Hadoop.

Pivotal unveiled the Big Data Suite–which combines Greenplum, PivotalHD, HAWQ, GemFire XD, GemFire, and SQLFire–to help simplify how customers adopt and implement Pivotal’s vision of a “big data lake,” says Michael Cucchi, senior director of product marketing for Pivotal.

“Customers saw our big data lake and said ‘Yes I want that! Where do I get started?’ The truth is, it’s complicated,” Cucchi tells Datanami. “It’s different for every customer. It depends where they are in maturity phase, what investments they have made in data management. What are their skill sets? What are their people trained to do? It’s really very different for each individual enterprise.”

Different technologies are brought to bear depending on the use case. If one part of a business needs real-time analytic capabilities, it may opt for Pivotal's GemFire XD, which delivers in-memory SQL queries over HDFS. If another department needs interactive processing on massive data sets, but can tolerate waiting a few seconds, it may opt for Pivotal's massively parallel Greenplum database. Still other analytic projects that handle even bigger data sets but are less time-critical and batch oriented may be fine to implement using MapReduce on top of HDFS.

Trying to come up with a product purchasing plan that would fulfill the big data lake dream was an exercise in complexity. For starters, customers might not know which of the products they needed. And once they figured out which products to buy, customers struggled to figure out how much of each product to buy. The fact that the different products had different licensing metrics only confounded matters (although, to be honest, that's par for the course in enterprise software, as any IBM or Oracle customer can tell you).

Pivotal’s Big Data Suite

The folks at Pivotal took a step back and decided they could simplify the whole purchasing and sizing exercise by bundling a collection of the most commonly used technologies–hence the Big Data Suite–and selling it with a single metric: per-core pricing.

The new subscription-based, per-core model may be the first time that a major Hadoop distributor is not charging based on the amount of data its customers store. "If you look at all the Hadoop vendors, they price either by terabytes or by node, which translates directly into how many terabytes they need to store," Cucchi says. "So we're out there telling all our customers, you'll be a better person, you'll be more competitive and efficient when you can leverage big data, so you should really store everything. And then we're sitting back and taxing them for every extra terabyte they store."

Of course, customers who dramatically increase their data are going to need more Hadoop nodes with more Intel Xeon cores to store that data on, so Pivotal does benefit from its customers’ big data hoarding at the end of the day. But one could argue that Pivotal’s approach is more democratic, especially when it comes to the way that it’s allowing customers to mix and match the use of different products under a single license for a given number of nodes in a cluster.
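The difference between the two licensing models can be sketched with a bit of arithmetic. Pivotal has not disclosed its rates, so every number below is hypothetical, purely to show how one model's cost tracks data volume while the other tracks cluster cores:

```python
# Hypothetical illustration of the two licensing models described above.
# All rates and cluster sizes are invented; Pivotal did not disclose pricing.

def per_terabyte_cost(stored_tb, rate_per_tb):
    """Traditional Hadoop-vendor model: cost scales with data stored."""
    return stored_tb * rate_per_tb

def per_core_cost(total_cores, rate_per_core):
    """Per-core subscription model: cost scales with CPU cores, not data."""
    return total_cores * rate_per_core

# A hypothetical 10-node cluster with 16 cores per node.
cores = 10 * 16
print(per_core_cost(cores, 1000))    # 160000 -- flat as stored data grows
print(per_terabyte_cost(500, 400))   # 200000 -- grows with every terabyte
```

Under the per-core model, a customer who doubles stored data on the same hardware pays nothing extra; under per-terabyte pricing, the bill doubles with it.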

“Customers can dynamically re-allocate licenses at any time without letting us know. They don’t have to call us,” Cucchi says. “This is basically saying, we want customers to partner with us; we want to become their data management partner. If they invest in us in a subscription model, we’re going to invest in them and let them grow their data size indefinitely on an enterprise Hadoop solution.”

Pivotal declined to share exact pricing figures, except to say that there is a minimum purchase amount for the two- and three-year subscriptions.

Related Items:

Pivotal Refreshes Hadoop Offering, Adds In-Memory Processing

Pivotal Helps NYSE with Multi-Petabyte Problem

Can the Internet of Things Help Us Avoid Disasters?
