March 31, 2020

Crafting a Hybrid Cloud Backup Strategy


The rise of the cloud has changed the relationship between companies and their data. With a wide mix of applications and data in the cloud, on premises, and in between, keeping track of everything can become a giant chore. This rapid change requires the careful attention of IT architects, particularly when crafting a strategy to back up one’s data in a hybrid environment.

One of the biggest mistakes customers make when moving applications and data to the cloud is assuming that the cloud vendor is going to back up the data. In a 2017 survey conducted by data backup and recovery provider Veritas, more than 40% of respondents said that data protection was the responsibility of the cloud provider.

“In some cases, they wrongfully assume that it’s just happening natively in the cloud,” says Alex Sakaguchi, senior director of global cloud solutions at Veritas. “If you look at the EULA [end user license agreement] from the big cloud providers, they pretty specifically call out that protecting the data and ensuring its recoverability is in fact the customers’ responsibility.”

All of the major cloud providers offer a base level of data resiliency for data stored in their platforms. Amazon Web Services’ S3, for example, is a distributed object storage system that uses erasure coding to protect data against the failure of a single server or node. Microsoft Azure and Google Cloud offer similar protection through their object storage services.
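To make the idea concrete, the toy Python sketch below shows parity-based protection in miniature: lose any one shard (think of a failed server or disk) and the data can be rebuilt from the survivors. It is a single-parity illustration of the principle only; real object stores use more sophisticated Reed-Solomon-style erasure codes, and the shard count here is arbitrary.

from functools import reduce

def make_shards(data: bytes, k: int = 4) -> list:
    """Split data into k equal-size data shards plus one XOR parity shard.
    Toy single-parity scheme; production object stores use Reed-Solomon
    codes that survive multiple simultaneous failures."""
    shard_len = -(-len(data) // k)                     # ceiling division
    padded = data.ljust(shard_len * k, b"\0")          # pad so shards divide evenly
    shards = [padded[i * shard_len:(i + 1) * shard_len] for i in range(k)]
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*shards))
    return shards + [parity]

def recover(shards: list) -> list:
    """Rebuild at most one missing shard (marked None) by XOR-ing the survivors."""
    missing = [i for i, s in enumerate(shards) if s is None]
    assert len(missing) <= 1, "single parity survives only one loss"
    if missing:
        survivors = [s for s in shards if s is not None]
        shards[missing[0]] = bytes(
            reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))
    return shards

original = b"backup me, I am important"
shards = make_shards(original)
shards[2] = None                                       # simulate a failed node
assert b"".join(recover(shards)[:4]).rstrip(b"\0") == original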

But don’t confuse that bare level of storage resiliency with a deliberately crafted backup and recovery solution, says Anthony Cusimano, cloud solutions marketing manager for Veritas.


“Folks assume, ‘Oh, it’s erasure coded. It’s in 43 locations and it has three parity nodes. It’s fine. I’ll never lose that data,’” Cusimano says. “But if they don’t have a point they can get back to, they’re done. They’re essentially hosed as far as their recovery strategy is concerned, because once that block is messed up and it’s replicated to all 42 other blocks, unless you have a secondary point-in-time contingency plan, there’s no way to get back.”
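Cusimano’s warning boils down to a simple distinction, sketched here in a few lines of illustrative Python (the values and the 42 replicas are stand-ins, not any product’s behavior): replication faithfully copies a corrupted write to every replica, while a point-in-time snapshot taken beforehand is the only thing left to restore from.

import copy
import datetime

# Replication alone keeps every copy in sync -- including copies of a bad write.
primary = {"invoice_total": 1000}
replicas = [copy.deepcopy(primary) for _ in range(42)]

# A point-in-time backup is an independent copy frozen at a known-good moment.
snapshot = {"taken_at": datetime.datetime(2020, 3, 31, 2, 0),
            "data": copy.deepcopy(primary)}

# A bad write (bug, ransomware, fat-fingered script) lands on the primary...
primary["invoice_total"] = -999999
# ...and replication dutifully propagates it everywhere.
replicas = [copy.deepcopy(primary) for _ in replicas]

assert all(r["invoice_total"] == -999999 for r in replicas)   # no clean replica left
primary = copy.deepcopy(snapshot["data"])                     # only the snapshot rolls back
assert primary["invoice_total"] == 1000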

The good news is that all of the cloud providers offer backup and recovery services to ensure their customers’ data is protected. These services are integrated with many of the applications and services that AWS, Microsoft Azure, and Google Cloud offer their clients. Third-party applications sold through the cloud providers’ marketplaces are also usually integrated with the providers’ backup and recovery services.
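On AWS, for instance, the native service is AWS Backup. The sketch below shows, under stated assumptions, roughly what wiring it up programmatically with boto3 could look like; the vault name, schedule, retention period, IAM role, and tag are placeholders, and the parameters should be checked against the current AWS Backup documentation.

# Sketch only: names, schedule, retention, role ARN, and tag are illustrative.
import boto3

backup = boto3.client("backup", region_name="us-east-1")

# A vault is the encrypted container that holds recovery points.
backup.create_backup_vault(BackupVaultName="example-vault")

# A plan says when backups run and how long recovery points are kept.
plan = backup.create_backup_plan(BackupPlan={
    "BackupPlanName": "daily-35-day-retention",
    "Rules": [{
        "RuleName": "daily",
        "TargetBackupVaultName": "example-vault",
        "ScheduleExpression": "cron(0 5 * * ? *)",     # 05:00 UTC every day
        "Lifecycle": {"DeleteAfterDays": 35},
    }],
})

# Resources (EBS volumes, RDS databases, etc.) join the plan by tag or ARN.
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "tagged-resources",
        "IamRoleArn": "arn:aws:iam::123456789012:role/ExampleBackupRole",
        "ListOfTags": [{"ConditionType": "STRINGEQUALS",
                        "ConditionKey": "backup",
                        "ConditionValue": "true"}],
    },
)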

The challenge comes when customers run their own applications in the cloud, or run applications across multiple clouds. They can use each cloud provider’s backup and recovery offering for each of these applications, or they can select a third-party offering that is integrated with the cloud and marketplace services, such as Veritas NetBackup, which is available in all three clouds’ marketplaces.

“Microsoft has their own version of backup. AWS has their own versions of backup,” Sakaguchi tells Datanami. “In some cases, they’re rudimentary in terms of capabilities they provide. But they’re also exclusive to their cloud platform. If you’re an enterprise that’s working across multiple clouds, that can quickly be a complex mess to deal with.”

Tape backup is still relied upon on-prem (Full_chok/Shutterstock)

Some larger enterprises are relying on Veritas to craft separate backup and recovery strategies for their cloud, hybrid cloud, and multi-cloud environments. In these cases, the fact that the Veritas NetBackup environment is separate from their AWS, Google Cloud, and Microsoft Azure environments is a key advantage, Cusimano says.

“It does make sense to abstract away from a company and instead leverage multiple locations outside of a single vendor,” Cusimano says. “It makes sense to have a secondary location outside of the cloud, just the same way we used to say your data center isn’t safe. If your data center can go down, then any data center can go down, so it always makes sense to have a separate location.”

Customers are trying many different approaches to cloud backup. They’re backing up their cloud applications to on-premises media servers, as well as backing up their on-premises applications to the cloud.

If the data is already in the cloud, it often makes sense to keep the backups in the cloud as well. When it comes to backing up from on-prem to the cloud, customers often start small by backing up their newer data sets and applications, while sticking with their existing backup methodologies for older apps and data sets. (If an enterprise has already purchased a tape drive or library, for example, that’s a sunk cost, and customers would be wise to maximize their use of that asset.)


The downside of having a separate backup solution is that customers must pay to move the data from the cloud to their backup location. It’s not uncommon for cloud vendors to charge three times as much to move data out of the cloud as they charge to move it in. That’s all part of the complex equation that customers must deal with when crafting a backup solution.
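A hedged back-of-the-envelope calculation shows how that asymmetry compounds at backup scale. The per-GB prices below are placeholders built around the rough 3x figure above, not any provider’s actual rate card:

# Illustrative placeholder prices; substitute the actual rate card and free tiers.
INGRESS_PER_GB = 0.03                  # assumed cost to move a GB into the cloud, USD
EGRESS_PER_GB = 3 * INGRESS_PER_GB     # ~3x to move the same GB back out

def transfer_cost(terabytes: float, per_gb: float) -> float:
    """Cost of moving a data set of the given size at the given per-GB price."""
    return terabytes * 1024 * per_gb

backup_tb = 50
print(f"into the cloud:   ${transfer_cost(backup_tb, INGRESS_PER_GB):,.2f}")   # $1,536.00
print(f"out of the cloud: ${transfer_cost(backup_tb, EGRESS_PER_GB):,.2f}")    # $4,608.00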

The cloud providers want to keep customers from exploring their off-cloud alternatives, Cusimano says. “They want to make sure you’re locked in and you’re not depending on any third-party cloud-hosted solutions, because once again they want to keep everything local,” he says. “They don’t want you to be exploring outside their field of view. So even when you deploy our solution, it’s the AWS version of NetBackup. You can take our solution and communicate with other platforms, but it’s going to be running in AWS on an EC2 instance using S3 storage. You’re just communicating outside of that endpoint.”

The disadvantage of working exclusively with Microsoft and Amazon is that the services appear to be “all wrapped under a homogeneous blanket, where it looks like you’re just interfacing with Microsoft, even though there could be eight different data centers your applications are actually spread across,” Cusimano says. “If there is an issue at the Microsoft level, you’re just hosed regardless.”

Increasingly, Veritas is being asked to craft multi-cloud and hybrid-cloud strategies to provide the utmost in data protection. The computing world was reminded of the fallibility of public clouds in 2018, when AWS suffered a massive outage in its Reston, Virginia data center that caused a ripple effect across the industry. That event may have slowed the migration of data and applications to the cloud, and it also may have caused enterprises to think twice about putting all of their eggs in one cloud basket.

Related Items:

Three Surprising Ways Archiving Data Can Save Serious Money

Backing Up Big Data? Chances Are You’re Doing It Wrong

Big Data Begets Big Storage
