May 7, 2020

The Big Cloud Data Boom Gets Even Bigger, Thanks to COVID-19

Public clouds were gaining momentum before COVID-19. Now, nearly eight weeks into the pandemic and associated shutdown of work and play, clouds are gaining even more ground as companies scramble to virtualize as much of their operations as possible, including data storage and analytics workloads.

For a glimpse of how the public cloud platforms have been impacted by the spread of coronavirus, let’s take a look at recent financial results from the Big 3. Many states implemented COVID-19 lockdowns in the third week of March, before the close of the quarters.

The Amazon Web Services (AWS) segment of ecommerce giant Amazon brought in $10.2 billion in revenue during its first quarter ended March 31. That was up from $7.7 billion a year earlier, a 33% increase.

Google Cloud reported $2.8 billion in revenue during its first quarter ended March 31, up from $1.8 billion in the same quarter last year, an increase of 52%.

Microsoft reported a 59% increase in revenue from Azure, which is part of its Intelligent Cloud segment. Microsoft doesn’t break out specific Azure figures, but Intelligent Cloud brought in $12.3 billion in revenue for its third quarter ended March 31.
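(As a sanity check, the growth rates above follow directly from the quoted revenue figures; the small gaps between computed and reported percentages come from the revenues being rounded to the nearest $100 million. Azure is left out because Microsoft disclosed only a growth rate, not a dollar figure.)

```python
# Year-over-year growth implied by the revenue figures quoted above (in $B).
# Because the revenues are rounded to the nearest $0.1B, the computed
# percentages can differ slightly from the officially reported growth rates.
revenues = {
    "AWS":          (7.7, 10.2),  # Q1 2019 -> Q1 2020
    "Google Cloud": (1.8, 2.8),   # Q1 2019 -> Q1 2020
}

for vendor, (prior, current) in revenues.items():
    growth = (current - prior) / prior * 100
    print(f"{vendor}: ${prior}B -> ${current}B, up {growth:.0f}% year over year")
```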

“If you step back and ask yourself, say, two years from now, is there going to be more being done in the public cloud or hybrid cloud or less? The answer is more,” Microsoft CEO Satya Nadella said during a conference call last week.

The Big 3 cloud providers already had considerable momentum going into the lockdown, and that momentum has increased substantially since, according to industry experts.

“Anytime there’s a recession, there’s a flight to efficiency,” says Ashish Thusoo, the co-founder and CEO of Qubole, a big data as a service (BDaaS) firm. “Cloud is a good example of that. Now more than ever companies cannot think of putting together large data teams to run and operate an infrastructure when they can easily consume it from a cloud service.”

Qubole’s BDaaS offering runs on all three public clouds (as well as Oracle Cloud), and lets customers bring a variety of data processing engines, including Apache Spark, Presto, and TensorFlow, to bear on the data they have stored in the data lake. Customers can get these services directly from the cloud vendors, of course, but the Qubole service provides a level of processing efficiency and cost savings in the cloud that typically isn’t available without significant infrastructure expertise, according to Thusoo, who previously ran Facebook’s data infrastructure team and was a co-creator of Apache Hive.
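Qubole’s own control plane isn’t shown here, but the underlying pattern it sells — pointing an open source engine directly at data in cloud object storage, rather than standing up and operating a dedicated cluster — looks roughly like the following PySpark sketch. The bucket and path are hypothetical, and it assumes a Spark installation with the S3 connector and credentials already configured.

```python
# Minimal sketch of the "bring the engine to the data lake" model described
# above. Assumes Spark with the hadoop-aws connector on the classpath and
# AWS credentials in the environment; the bucket and path are made up.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lake-analytics").getOrCreate()

# Read Parquet files straight out of object storage -- no HDFS to operate.
events = spark.read.parquet("s3a://example-data-lake/events/2020/03/")

# A typical ad hoc aggregation an analytics team might run.
daily = events.groupBy("event_date").count().orderBy("event_date")
daily.show()
```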

“Generally COVID will cause a lot of restructuring to move to an online world. In those areas, a lot of innovation will continue to happen,” Thusoo says. “Look at traditional industries. Oil and gas is hurting big time right now. Why won’t they look at cloud as a mechanism to…get the analytics faster and at a much lower cost than having to run their own data center and then having to run their own data teams? I think it will be a flight to efficiency, and cloud will be a more efficient model.”

The public clouds have become so ubiquitous that even conventional storage vendors are changing their business models to account for them. For instance, NetApp is best known as a provider of speedy flash-based network-attached storage (NAS) hardware. But over the past few years, NetApp has shifted its business model to focus on providing virtual data services in the cloud.

“Enterprise storage customers that have not dealt with NetApp in the past few years may not even recognize the vendor,” writes IDC analyst Eric Burgener in a brief that’s available on the NetApp website.

With the recently announced Project Astra, NetApp is now aiming to be a one-stop shop for data management in the cloud. No matter which cloud the underlying data sits on, NetApp says Astra will expose the data to clients and ensure that it resides in the most cost-effective storage tier. And by supporting Kubernetes, NetApp aims to ensure that customers’ applications can access the data, even if it’s spread across multiple clouds or on-prem data centers.
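Astra’s own interfaces aren’t detailed here, but the Kubernetes building block this kind of multi-cloud data access leans on is the standard PersistentVolumeClaim: an application requests storage by class and size, and the cluster — whichever cloud it runs in — binds a matching volume behind the scenes. Below is a minimal sketch using the official kubernetes Python client; the storage class name is hypothetical, not an actual NetApp product name.

```python
# Minimal sketch of the Kubernetes storage abstraction described above: the
# application asks for storage by class and size, and the cluster binds a
# backing volume from whichever cloud it happens to run in. Uses the
# official `kubernetes` client; the storage class name is hypothetical.
from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() when run in a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="analytics-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="netapp-cloud-tier",  # hypothetical class name
        resources=client.V1ResourceRequirements(requests={"storage": "100Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```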

Container technology, and Kubernetes specifically, has emerged as a key enabler, giving companies the confidence to grow their cloud investments. A recent survey from data virtualization firm Denodo found that the use of container technologies was up by 50% from 2019. Docker was used by 46% of survey respondents, followed by Kubernetes at 40%.

“As data’s center of gravity shifts to the cloud, hybrid cloud and multi-cloud architectures are becoming the basis of data management,” said Ravi Shankar, senior vice president and chief marketing officer of Denodo. However, Shankar noted that the challenge of integrating data in the cloud has increased significantly.

A recent survey by Technology Business Research indicates a 13-percentage-point increase in the share of companies boosting their cloud spending as a result of COVID-19, from 17% in March to 30% two weeks later. “SaaS and IaaS are among some of the few IT segments that may see increased demand in the first half of this year,” wrote Angela Lambert, an engagement manager and senior analyst with TBR.

Things changed quickly in mid-March. With millions out of work and told to stay home, businesses quickly pivoted to an online presence. In the big data space, data was already migrating to the cloud, and COVID-19 has given companies a reason to accelerate those migrations.

Related Items:

Cloud Looms Large for Big Data in 2020

Cloud Now Default Platform for Databases, Gartner Says

As Cloud Grows, Is Resistance to AWS Futile?
