October 27, 2022

Data in the Age of Stagflation: How Enterprises Can Do More with Less

Andrew Smith


Global data growth is exploding, accelerated by the growing number of people communicating, working, learning, and entertaining themselves from home and in distributed work environments. Let’s take a look at some numbers. According to analyst firm IDC, organizations experience an average of 30% data growth annually. When we look at just unstructured data, the growth rate jumps to 50% each year, according to Gartner. Furthermore, Gartner predicts that by 2026, large enterprises will triple their unstructured data capacity stored as file or object storage on-premises, at the edge, or in the public cloud, compared to 2022.

The fact is that enterprises are creating and storing more data than ever before due to ongoing pressure to digitize operations and workflows, generate more metrics to make data-driven business decisions, and retain data for longer periods of time to satisfy increasingly stringent security and privacy regulations. However, today’s organizations face a widening gap between exponential data growth and relatively flat IT budgets. Managing this is a significant challenge, and as a result, many IT teams must learn how to “do more with less” when it comes to storing and analyzing data while maintaining competitive advantage in a digital world. Let’s explore some of the ways we see organizations addressing this challenge.

Reduce Technical Redundancies, Embrace Subscription-based Infrastructure Services

When faced with a limited budget, it’s critical for IT teams to first take a step back and identify any existing technical redundancies and inefficiencies.

Automating baseline infrastructure and data management tasks – such as virtual machine provisioning, data compression, deduplication, encryption, and high availability – can save a significant amount of administrative time. This frees IT staff to support additional business initiatives, like implementing a data analytics tool, building in self-service access for users, or streamlining access to the data sources that rely on this infrastructure.
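Deduplication, one of the tasks listed above, is worth a quick illustration: storage systems typically key chunks of data by a content hash and store each unique chunk only once. This is a minimal sketch of the idea, not any vendor’s actual implementation:

```python
import hashlib

def dedupe_store(chunks):
    """Store only unique chunks, keyed by content hash.

    Returns the chunk store plus an ordered list of hash references
    that can reconstruct the original stream.
    """
    store = {}   # hash -> chunk bytes (each unique chunk stored once)
    refs = []    # ordered references used to rebuild the data
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk
        refs.append(digest)
    return store, refs

# Three chunks, two of them identical: only two unique chunks are kept.
data = [b"block-A", b"block-B", b"block-A"]
store, refs = dedupe_store(data)
print(len(store), len(refs))  # 2 unique chunks, 3 references
```

Real deduplication engines add chunk boundaries, compression, and on-disk indexes, but the space savings come from exactly this hash-and-reference structure.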


IT teams can also create more flexibility in their budgets by taking advantage of subscription-based infrastructure purchases and services. Organizations have been doing this for years in order to transition their infrastructure spending from a capital expense to an operational expense. In addition to this financial benefit, there are also cost-efficiency benefits gained from this model. Most subscription-based infrastructure services can scale up or down with usage. So, if an organization needs more storage capacity or compute power, it can provision and pay for it; when it uses less, the bill reflects the lower usage. This “consumption-based” billing can have its challenges – specifically when it comes to accurately forecasting usage over the long term, or predicting usage spikes – but it can eliminate the risk of over-purchasing infrastructure capacity that would otherwise sit idle.
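The consumption-versus-fixed-capacity trade-off above can be made concrete with a back-of-the-envelope comparison. The rates and usage figures below are purely illustrative assumptions, not any provider’s real pricing:

```python
def monthly_bill(usage_tb, rate_per_tb):
    """Consumption-based: pay only for capacity actually used that month."""
    return usage_tb * rate_per_tb

def fixed_capacity_cost(capacity_tb, rate_per_tb):
    """Capital-style: pay for provisioned capacity regardless of use."""
    return capacity_tb * rate_per_tb

# Illustrative numbers: usage varies month to month, while fixed capacity
# must be sized for the peak (100 TB) and sits partly idle otherwise.
usage = [40, 55, 100, 60]   # TB consumed in four sample months
rate = 6.0                  # assumed $/TB/month

consumption_total = sum(monthly_bill(u, rate) for u in usage)
fixed_total = fixed_capacity_cost(100, rate) * len(usage)
print(consumption_total, fixed_total)  # 1530.0 vs 2400.0
```

The same arithmetic also shows the forecasting risk the article mentions: if usage spiked to 100 TB every month, the consumption bill would match the fixed one, with none of its predictability.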

Ensure Consistent Access to Data, Regardless of Location

Modern, digital organizations rely on data analytics to drive business decisions and customer engagement, and need consistent and reliable access to data in order to ensure they are working with the most accurate and up-to-date information. Over the years, we’ve learned that organizations tend to move data and applications across infrastructure environments to help balance cost, performance, and security. Providing seamless access to data across environments (on-premises, in the cloud, or at the “edge” – often referred to as “core, cloud, edge”) is no small task, but it is a key enabler, helping organizations generate actionable insights in real time.

Of these three locations (core, cloud, edge), cloud is increasingly becoming the de facto standard for centralizing large volumes of unstructured data so that it can remain secure, accessible, and cost-effective. In fact, according to Flexera’s state of the cloud survey, 48% of organizations’ data resides in a public cloud today, and that share is projected to surpass the 50% mark within the next 12 months.

However, with this boom in cloud adoption comes the potential for increased complexity due to the adoption of multiple cloud providers and environments. Data management in these “hybrid cloud” and “multicloud” environments isn’t always straightforward. Ensuring that information does not become siloed in a single cloud environment and that it can be moved, managed, and secured regardless of location, remains a significant challenge for many organizations. With this in mind, IT teams should pursue an infrastructure and data management strategy that emphasizes the need to connect data across all three environments (cloud, core, and edge) in a way that is efficient and reliable.

Don’t Sleep on Security


It’s critical for organizations to remember that a single security breach or ransomware attack has the potential to make or break a data strategy, as well as cost a company hundreds of thousands of dollars. When determining where to allocate IT budget resources, it’s necessary to ensure that security tools and practices are well funded and up-to-date (keep those backups current!). Take care to follow fundamental security practices (zero-trust, “3-2-1-1” backup strategy, “air gapping”), and leverage native data protection features from your service provider – like immutability and replication – if available. Turning on some of these features may be as simple as clicking a few buttons.
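As a rough illustration of the “3-2-1-1” rule referenced above (at least 3 copies of the data, on 2 different media types, with 1 copy off-site and 1 copy immutable or air-gapped), a simple policy check might look like the following sketch. The backup inventory and field names are hypothetical:

```python
def meets_3_2_1_1(copies):
    """Check a backup inventory against the 3-2-1-1 rule:
    >= 3 copies, >= 2 media types, >= 1 off-site, >= 1 immutable."""
    media_types = {c["media"] for c in copies}
    has_offsite = any(c["offsite"] for c in copies)
    has_immutable = any(c["immutable"] for c in copies)
    return (len(copies) >= 3 and len(media_types) >= 2
            and has_offsite and has_immutable)

# Hypothetical inventory: primary disk, local tape, immutable cloud copy.
copies = [
    {"media": "disk",   "offsite": False, "immutable": False},
    {"media": "tape",   "offsite": False, "immutable": False},
    {"media": "object", "offsite": True,  "immutable": True},
]
print(meets_3_2_1_1(copies))  # True
```

Dropping the immutable cloud copy from the inventory would fail the check, which is exactly the gap a ransomware attack exploits.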

When it comes to cloud services, it is also essential that organizations understand their provider’s shared responsibility model. Who is responsible for what in terms of guaranteeing and protecting infrastructure, data, and/or application availability and durability? What are the basic security, compliance, and data protection functions that the cloud services provider is expected to fulfill, and what functions is the end-user responsible for? The answers to these questions vary depending on each service provider and use case. Understanding the lines of demarcation when it comes to specific security and data management tasks helps organizations establish a baseline, identify the areas where they need to invest for additional functionality, and execute a comprehensive security strategy leveraging shared responsibility as a guiding principle.

Threats of a potential recession, continued inflation, and higher interest rates all affect how organizations think about their short-term strategic and financial initiatives. But it is imperative that businesses maintain their security and data protection capabilities, both to mitigate malicious attacks like malware and ransomware and to help protect against accidental data loss due to human error or an unplanned outage.

While the exact trajectory of a recession and its impact on the future of IT spending remain to be seen, what is certain is that IT departments will remain under pressure to harness the power of data to drive innovation, differentiation, and digital transformation. This persistent pressure will continue to drive many organizations to adopt flexible, scalable, low-cost infrastructure services, not only because these services help them “do more with less,” but also because they provide the security, performance, and accessibility needed to establish a resilient data management strategy that can be integrated and deployed across core, cloud, and edge environments.

About the author: Andrew Smith is the senior manager of strategy and market intelligence at Wasabi Technologies. He previously was an analyst at IDC and a researcher at Forrester.

Related Items:

Understanding Your Options for Multi- and Hybrid-Cloud Data Management

Cloud Looms Large for Big Data in 2020

Multi-Cloud Complexity Heightens Security Threats