February 7, 2019

Let the Data, Not the App, Drive Your Multi-Cloud Strategy

Chadd Kenney


Artificial intelligence, machine learning, and analytics aren’t just technological trends. Data-driven innovation and decision-making are the future of business. There’s no longer any question.

Yet the volume and variety of data are overwhelming. Transaction data, IoT devices and sensors, and video and audio from a multitude of sources are generating massive data sets. Global IP traffic is now measured in zettabytes, and the deluge is rising fast.

According to a survey by MIT Technology Review and Pure Storage, the vast majority of business leaders say data is the foundation for making decisions, delivering results for customers, and growing the business. Those same executives say they face challenges in digesting, analyzing, and interpreting it all. And managing it is expensive.

In this new world, application and data mobility are key to driving efficiency and business advantage. In the old days, specific applications dictated infrastructure needs, but in an era of exponential data growth, how data is used – and how accessible it is – should be the driving factor in any data strategy.

Modern businesses need real-time access to any and all data. That means making the most of business data in the reality of a multi-cloud environment – enabling applications to move freely among on-premises, private cloud, and public cloud environments. The right strategy is on-prem and cloud, not either/or.

Why is this so important? According to IDG, 90% of companies will have a portion of their applications or infrastructure in the cloud this year. Among the many lessons learned from that migration over the last several years is the need to drive efficiency and cost savings while meeting strategic business needs. That means setting data free, but doing so in the most efficient manner possible, which for many organizations looks vastly different from how they operate today.

The bottom line: executives should be making strategic decisions about the appropriate environment based on the type of data and the applications making use of that data. For example, applications that run consistently day after day and week after week — mission-critical, steady-state apps that run all the time without a lot of variance — are better suited to an on-prem instance. It’s simply less expensive than running such apps persistently in the cloud.


But workloads that must spin up or down with some frequency — and which require lots of compute — are better suited to the public cloud, where they can take advantage of cloud economics and pay only for the time they’re actually running.
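To make the economics concrete, here is a back-of-the-envelope sketch in Python. All of the rates, amortization figures, and run-time hours are illustrative assumptions rather than actual cloud or hardware pricing; the point is only that a roughly fixed on-prem cost wins when utilization is constant, while pay-per-use wins when it isn’t.

```python
# Hypothetical comparison of steady-state vs. bursty workload costs.
# All rates are illustrative placeholders, not real pricing.

HOURS_PER_MONTH = 730

def monthly_cloud_cost(hourly_rate: float, hours_used: float) -> float:
    """Pay-per-use: billed only for the hours the workload actually runs."""
    return hourly_rate * hours_used

def monthly_onprem_cost(amortized_capex: float, opex: float) -> float:
    """On-prem: a roughly fixed cost whether the workload runs 1 hour or 730."""
    return amortized_capex + opex

# Steady-state, mission-critical app running around the clock
steady_cloud = monthly_cloud_cost(hourly_rate=2.00, hours_used=HOURS_PER_MONTH)   # ~$1,460
steady_onprem = monthly_onprem_cost(amortized_capex=700, opex=300)                # ~$1,000

# Bursty analytics job that runs ~40 hours a month on large compute
bursty_cloud = monthly_cloud_cost(hourly_rate=8.00, hours_used=40)                # ~$320
bursty_onprem = monthly_onprem_cost(amortized_capex=2500, opex=500)               # ~$3,000

print(f"Steady-state: cloud ${steady_cloud:,.0f} vs on-prem ${steady_onprem:,.0f}")
print(f"Bursty:       cloud ${bursty_cloud:,.0f} vs on-prem ${bursty_onprem:,.0f}")
```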

As executives begin shifting to strategic decisions based on the type of data rather than the type of application, one way to accomplish this is to view the classic concept of application tiering in a new way.

Tiering, Evolved

Most enterprises run a mix of workloads, and the concept of tiering has evolved with the advent of fast, flash-based storage. The idea that mission-critical Tier 1 applications belong on high-performance storage, Tier 2 applications on mid-performance storage, and Tier 3 applications on cold storage purely for economic reasons is out of date. Flash has democratized the data center, and enterprises realize there is value in all of their data. In other words, there is no longer any such thing as “cold data.”

Modern data centricity means that application mobility – the ability to seamlessly move applications born in the cloud to an on-premises environment, or vice versa, based on what the data dictates – is what is truly mission critical.

Data mobility across public and private cloud requires a common tier of shared data. An Oracle database, for example, might run on Tier 1 storage, but the data in it might be leveraged in many places across the enterprise, depending on the use case.

Because it’s business critical, the database should run at all times on highly resilient, low-latency storage on premises. Yet periodic analytics reports or an intelligent algorithm that runs for end-of-month reporting are good candidates for the cloud. The agility of the cloud means you don’t need that massive compute footprint on premises; you can spin up the compute you need quickly, for only as long as you need it. Need a report faster? Spin up more compute. A common data layer makes data mobile and applications agile.
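A minimal sketch of that spin-up-for-the-window pattern, assuming AWS EC2 and the boto3 SDK; the AMI ID, instance type, and report-submission step are hypothetical placeholders, and any public cloud’s compute API could play the same role:

```python
# Sketch: launch large compute only for the reporting window, then tear it down.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def run_report(instance_id: str) -> None:
    """Placeholder for submitting the end-of-month analytics job
    against the shared data layer."""
    print(f"Running report on {instance_id}...")

def run_month_end_report():
    # Launch a large instance sized for the report, not for steady state
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # hypothetical analytics image
        InstanceType="r5.4xlarge",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = resp["Instances"][0]["InstanceId"]
    ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])

    try:
        run_report(instance_id)
    finally:
        # Tear the compute down as soon as the report finishes,
        # so you pay only for the window actually used.
        ec2.terminate_instances(InstanceIds=[instance_id])

if __name__ == "__main__":
    run_month_end_report()
```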

Data Centricity as a Bridge in a Multi-Cloud World

Today, a cloud divide persists: the on-prem and hosted environment on one side and the public cloud on the other, with different management and consumption experiences, different application architectures, and different storage.


What if you could bridge the two with seamless orchestration, bi-directional mobility, and common shared data services? Hybrid cloud requires a data-centric architecture, one built at its core to share data in real time and to make moving data and applications easy.

Enterprises need this data centricity in order to be more competitive in the market – it enables a much simpler way of deploying IT services and provides the real-time access to data required to gain more intelligence and make faster decisions.

Consider the example of that Oracle OLTP database instance running on-premises. Being able to send copies of its data out to the public cloud gives enterprises a huge advantage of scale with an agile compute architecture. With a common data layer, you take the on-prem data-centric architecture and extend the same experience into the public cloud.

Similarly, cloud-native app deployments can be enhanced with data services that increase efficiency through data reduction, snapshot, and replication facilities. Indeed, your data should even define your disaster recovery strategy. In a multi-cloud world, data centricity lets you treat the cloud as a second data center. If the on-prem environment goes down, spinning up a cloud-hosted backup environment makes recovery much less painful.
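As a rough illustration of the cloud-as-second-data-center idea, the sketch below copies periodic on-prem snapshot exports to cloud object storage that a recovery environment could later be built against. It assumes AWS S3 via boto3; the bucket name, key layout, and snapshot path are hypothetical.

```python
# Sketch: ship nightly on-prem snapshot exports to cloud object storage
# so a recovery environment can be spun up against them if needed.
import boto3
from datetime import datetime, timezone
from pathlib import Path

s3 = boto3.client("s3")
BUCKET = "example-dr-snapshots"   # hypothetical DR bucket

def replicate_snapshot(snapshot_path: str) -> str:
    """Copy one on-prem snapshot file to the cloud-hosted DR tier."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    key = f"oracle-oltp/{stamp}/{Path(snapshot_path).name}"
    s3.upload_file(snapshot_path, BUCKET, key)   # standard boto3 S3 upload
    return key

# Example usage, e.g. after the storage array produces its nightly export:
# replicate_snapshot("/mnt/snapshots/oracle_oltp_nightly.dmp")
```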

That’s how you liberate applications, unify cloud, and manage data across the data center and cloud.

About the author: Chadd Kenney serves as the chief technical officer of the Americas at Pure Storage, where he is a senior technical advisor to engineering, product management, sales, marketing, and presales leadership in the field. Chadd joined Pure Storage in 2012 as a founding member of the technical sales staff, assisting in the development of the go-to-market strategy and creation of the sales force to disrupt the storage industry. Previously, Chadd was the Field Chief Technical Officer and Divisional Systems Engineering Director at EMC Corporation. Chadd has a Bachelor of Science in Computer Information Systems and Accounting from San Francisco State University, College of Business.

Related Items:

‘Open Hybrid’ Initiative Targets Big Data Workloads

What’s Driving the Cloud Data Warehouse Explosion?

The Top Three Challenges of Moving Data to the Cloud
