March 5, 2019

Google Touts Stable Cloud Storage Prices

Thomas Kurian, who succeeded Diane Greene last November as chief of Google Cloud, has been outspoken about plans to compete more aggressively in the cutthroat public cloud market. Among his first moves was announcing plans to invest $13 billion this year in U.S. datacenter expansion.

That was followed this week by a new fixed-price cloud storage scheme that allows customers to use as much data storage as needed in return for a commitment to spend at least $10,000 monthly for 12 months. Customers “can grow stored data, with no extra charges for usage over [their] commitment, during those 12 months,” Google Cloud announced in a blog post on Tuesday (March 5).
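To make the economics concrete, here is a minimal back-of-the-envelope sketch in Python. Only the $10,000-per-month commitment comes from the announcement; the per-gigabyte list price and the usage figures below are hypothetical placeholders, not published numbers.

    # Back-of-the-envelope comparison of the fixed-price commitment vs.
    # pay-as-you-go storage. Only the $10,000/month figure is from
    # Google's announcement; every other number is a made-up example.

    MONTHLY_COMMITMENT = 10_000   # USD/month, from the announcement
    LIST_PRICE_PER_GB = 0.026     # USD per GB-month -- hypothetical list price

    def monthly_bill(stored_gb: float, committed: bool) -> float:
        """Return the month's storage bill in USD."""
        if committed:
            # Under the plan, growth above the commitment incurs no
            # extra charge during the 12-month term.
            return MONTHLY_COMMITMENT
        return stored_gb * LIST_PRICE_PER_GB

    # Example: storage that grows from 300 TB to 600 TB over the year.
    for stored_tb in (300, 450, 600):
        gb = stored_tb * 1_000
        print(f"{stored_tb} TB: committed ${monthly_bill(gb, True):>9,.2f}  "
              f"pay-as-you-go ${monthly_bill(gb, False):>9,.2f}")

At these made-up rates the flat commitment overtakes pay-as-you-go somewhere near 385 TB, which is the basic bet the plan asks customers to make about their growth.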

The shift to “ready-when-you-need-it” data storage appears to be the latest twist in the public cloud price wars among Google, Amazon Web Services and Microsoft Azure. Now, Google Cloud is adding a “scalable” data plan as a path to consolidated storage on a managed platform, along with a price break based on an upfront customer commitment.

Google is also promoting “unified object storage” as a better option for consolidating data storage and resolving capacity issues, with object storage allowing customers to store and move data as needed. Google’s pitch targets high-end object storage as well as “low-frequency” backup and archival storage.

The service also includes several data transfer and streaming tools, including the ability to move data from the AWS Simple Storage Service (S3) to Google Cloud; once there, users could integrate the data with analytics services like BigQuery and CloudML. The new plan prices storage on a gigabyte-per-month basis.
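Google’s usual path for S3-to-Cloud Storage moves is its Storage Transfer Service. As a rough illustration (not taken from the article), a one-off transfer job might be created like this; the project ID, bucket names, and credential placeholders are all hypothetical, and the snippet assumes Application Default Credentials are configured.

    # Hedged sketch: create a one-time Storage Transfer Service job that
    # copies an S3 bucket into Cloud Storage. Error handling omitted.
    from googleapiclient import discovery

    service = discovery.build("storagetransfer", "v1")

    transfer_job = {
        "description": "One-off S3 -> GCS migration (example)",
        "status": "ENABLED",
        "projectId": "my-gcp-project",                 # hypothetical
        "schedule": {
            # Identical start and end dates => the job runs once.
            "scheduleStartDate": {"year": 2019, "month": 3, "day": 6},
            "scheduleEndDate": {"year": 2019, "month": 3, "day": 6},
        },
        "transferSpec": {
            "awsS3DataSource": {
                "bucketName": "legacy-s3-bucket",      # hypothetical
                "awsAccessKey": {
                    "accessKeyId": "AWS_ACCESS_KEY_ID",         # placeholder
                    "secretAccessKey": "AWS_SECRET_ACCESS_KEY", # placeholder
                },
            },
            "gcsDataSink": {"bucketName": "gcs-landing-bucket"},  # hypothetical
        },
    }

    result = service.transferJobs().create(body=transfer_job).execute()
    print("Created transfer job:", result["name"])

Once the objects land in Cloud Storage, BigQuery can load them or query them in place.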

After one year, renewal options include committing for an additional year with pricing based on peak usage. Those not renewing would be on the hook for the previous year’s overages.

A recent survey found that 24 percent of enterprise multi-cloud adopters use Google Cloud in combination with the AWS public cloud.

Google Cloud has gained momentum through BigQuery, which it introduced in 2011. The amount of data stored in BigQuery doubled between 2016 and 2017, while overall Google Cloud usage increased four-fold.

The cloud storage vendor said this week it is responding to unpredictable data growth and use cases. For example, a “legacy image archive might become relevant again” for image API training, “or an analytics workload might only sit in hot storage for a month,” Google noted.

Hence, the storage plan allows users to shift data “between hot and cold classes of storage and maintain cost predictability.”
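As an illustration of that hot/cold shuffling, the google-cloud-storage Python client can rewrite an object into a different storage class in place. The bucket and object names below are hypothetical, and the class names reflect the tiers available at the time.

    # Minimal sketch of moving one object between hot and cold storage
    # classes with the google-cloud-storage client. Bucket and object
    # names are hypothetical; update_storage_class() performs a
    # server-side rewrite of the object.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("analytics-workloads")     # hypothetical
    blob = bucket.blob("archive/images-2015.tar")     # hypothetical

    # Demote a dormant archive to the cold tier...
    blob.update_storage_class("COLDLINE")

    # ...and pull it back to a hot tier when, say, an image-training
    # job makes it relevant again.
    blob.update_storage_class("REGIONAL")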

The cloud vendor also said it is dropping prices for redundant storage by 42 percent. That option includes “geo-redundancy,” in which even the lowest-access tier of cloud storage is protected from datacenter failure by storing a second copy in a different region at least 100 miles away.

Recent items:

Google Doubles Down on Cloud Data Migration

What’s Driving the Cloud Data Warehouse Explosion?

Google’s BigQuery Gaining Steam As Cloud Warehouse Wars Heat Up
