August 25, 2015

Five Reasons for Leaving Your Data Where It Is

Markus Rex

Not a week goes by without a major story about vulnerable data in the hands of the wrong people. Whether it’s a data breach at a government agency or global retailer exposing financial information, cloud service providers like Microsoft, Google and Amazon battling government agencies over access to data, or potential HIPAA violations resulting from compromised health data, we are constantly reminded that, unfortunately, people are after our data – and they’ve gotten pretty good at getting it.

Despite this, cloud adoption is exploding, with businesses increasingly trusting third parties with their sensitive data. According to Forrester Research, the public cloud market is expected to reach $191B by 2020, up from $58B in 2013. That’s more than the GDP of countries such as New Zealand, Hungary or Vietnam. The trends are clear as day – cloud adoption and cloud risk are both on the rise. There’s a reason cloud vendors are so quick to trumpet their security and privacy capabilities.

As cloud adoption continues its meteoric ascent, the number of access points to enterprise data also increases. And while keeping data on-premise is no guarantee of security, it does drastically reduce the number of access points. But the case for keeping data on-premise goes far beyond a common-sense security and risk-mitigation play that tells us the fewer servers our data runs through, the safer it is.

Sure, security and privacy are the big, bright red flags. But there are other, less obvious reasons for businesses to keep their data where it is – on their own servers.  Let’s take a look at five important considerations:

1. You’ve Already Invested in Infrastructure


Many IT departments have spent years, even decades, building infrastructure secure, powerful and agile enough to run their business. Yet hardware spending is down and spending on IT services is up: cloud computing has caused a tidal shift in the industry, significantly reducing the use of proprietary infrastructure as IT administrators increasingly outsource services.

Instead of looking for solutions that ditch that infrastructure, organizations that identify solutions that leverage it (and the large investment that built it) will not only use their resources more efficiently, but will also avoid turning their IT infrastructure into a ghost town incapable of supporting future projects as business needs evolve.

2. Your Business Processes Are Not Broken

Technology infrastructure is a valuable asset, but it’s a drop in the bucket compared to an organization’s business processes. Business processes are a huge component of most companies’ differentiation – the fundamental building blocks on which companies are constructed. An organization is only as strong as its business processes, and those processes often depend on having files and data in a certain place.

Wholesale changes to business processes and user workflow – even just an extra step like uploading to or downloading from Dropbox – can disrupt the processes your company and your employees rely on, and adversely affect your business. At the same time, the cloud brings the added risk of storing data and process logic with a third party, creating additional access points for competitors or other nefarious characters that might be after your data.

3. Cloud Pricing Is No Sure Thing

Since the dawn of the Cloud Era, cost savings have been the rallying cry and calling card – but taking that prevailing logic as gospel is irresponsible, and potentially costly. Sure, the cloud helps reduce IT costs in most cases, but the promise of extreme TCO reduction has been oversold and under-delivered. There is also the cross-subsidization of free users: if a cloud service offers a free tier, “premium” users end up paying to cover those free users. The cloud is not a cost panacea. It can be a life saver for smaller enterprises or new businesses that have nothing already built, but for enterprises with thousands of users, subscribing to cloud-based services – and the associated bandwidth requirements – can be economically dicey.

From the waste of overprovisioning cloud services to inflated migration costs, the move to the cloud is not quite the Holy Grail we’ve been led to believe. Selecting the wrong initial storage tier or server pricing level can have long-lasting budget ramifications. And while moving data into the cloud is cost-effective, extracting that data later can consume significant resources and budget if not managed closely.
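To make that egress point concrete, here is a minimal back-of-the-envelope sketch in Python. Every price and usage figure in it is a hypothetical placeholder rather than a quote from any provider or a number from this article; real pricing varies widely by provider, region and workload.

# Illustrative sketch only: all prices and usage figures below are hypothetical placeholders.

def cloud_monthly_cost(stored_tb, egress_tb,
                       storage_per_tb=23.0, egress_per_tb=90.0):
    """Rough cloud bill: pay per TB stored plus per TB pulled back out (egress)."""
    return stored_tb * storage_per_tb + egress_tb * egress_per_tb

def on_prem_monthly_cost(hardware_capex=120_000, amortize_months=48,
                         ops_per_month=1_500):
    """Rough on-premise cost: amortized hardware plus operations, with no egress fees."""
    return hardware_capex / amortize_months + ops_per_month

stored_tb, egress_tb = 200, 50  # TB stored and TB pulled back out each month
print(f"cloud:   ${cloud_monthly_cost(stored_tb, egress_tb):,.0f} per month")
print(f"on-prem: ${on_prem_monthly_cost():,.0f} per month")

With these made-up numbers, the egress charges alone roughly match the storage line item, which is exactly the kind of cost that is easy to overlook when the initial migration looks cheap.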

4. Data Rules Are Tightening

Depending on the industry you’re in, the physical location of your data may make or break your regulatory compliance standing. In the wake of PRISM and other government data surveillance and collection programs (not to mention hundreds of breaches at the hands of malicious attackers), many regulatory bodies outside the U.S. have passed legislation tightening controls on data. The forthcoming General Data Protection Regulation, which will unify European data privacy and protection laws, will have real teeth, and will make organizations far more accountable than before in the event of a breach or data leak. PIPEDA in Canada applies to all organizations engaged in commercial activities and establishes rules governing the transfer of personal information to an organization in another jurisdiction for processing, including accountability for that information.

Regulations across the globe are evolving quickly – and with that evolution comes a great deal of uncertainty. What’s compliant today may not be tomorrow, or five years from now, given the shifting regulatory sands – a risk most enterprises cannot afford to take.

5. Let Your IT Department Do Its Job

Since the introduction of the PC, IT departments have served as a hub for innovation within organizations. Departments have been tasked with creating and supporting environments where technology can enable and improve the business. A wholesale move to the cloud can have the unfortunate effect of neutering your IT department, demonstrating a lack of faith and diluting the caliber of talent responsible for keeping you up and running.

Instead of undercutting IT, empower the department to decide when to leave data where it is and when to move it to the cloud. Let IT do its job rather than doing it for them by forcing a cloud move: they know where a move makes sense and can help determine the appropriate balance, becoming enablers of a modern enterprise.

Despite these compelling reasons to keep data where it is, no IT administrator is going to ignore the cloud completely. Any CIO who isn’t considering some use of the cloud lacks the innovative inclination that got them to their current position. But when you have the option to keep data on-premise or move it, you need to weigh the business processes, existing infrastructure, policies and procedures, and cost impact before deciding which strategy makes sense for your business. Rather than a wholesale rip-and-replace, IT departments should take into account the whole cost and risk picture, and leverage the cloud where it makes sense.

Cloud computing has changed the face of IT, in many cases for the better. When executed correctly, it can be an excellent way for beleaguered IT departments to save some money – but even saving money comes with a cost. Privacy, security, control, visibility, continuity and innovation can all take a back seat when organizations cede all of their data to a cloud service provider. Organizations must take a hard look beyond the potential savings before they do.

 

About the author: With nearly 20 years of open source experience behind him, Markus Rex founded ownCloud, Inc. with community leader Frank Karlitschek and long-time SUSE colleague Holger Dyroff, determined to bring secure file sync and share to business. As CEO, Markus is responsible for all day-to-day management, yet never misses a chance to go under the ownCloud hood and tinker with fixes and QA. Prior to ownCloud, Markus served as senior vice president and general manager of Novell’s SUSE Linux Open Platform Solutions business unit. Previously he held the position of CTO for the Linux Foundation. He has also served as Novell’s CTO for Linux and in various engineering management roles at SUSE. Markus is a graduate of the Harvard Business School General Management Program and attended Erlangen University.

Related Items:

Hybrid Analytics Yield Big Data Flexibility

Tracking the Rapid Rise in Cloud Data

 

 

 
