The Age of Data Productivity is Here
Black Swan events like extreme weather, financial crises, pandemics, and war used to be anomalies. Today, these events occur with surprising regularity, and such life-altering disruptions induce extreme stress on individuals and across society. The impact has reverberated around the globe, with a resounding theme: uncertainty and drastic change are on a collision course.
While high-pressure events provoke breaking points in global economies, they also occasionally serve as a catalyst for change and innovation in business. Faced with unprecedented uncertainty and new considerations, business leaders like me suddenly found ourselves needing to pivot as nimbly as possible to forge a new path forward for our companies, teams, and customers. As the world went remote, business priorities and strategies shifted to accommodate hybrid work environments, which for many companies meant moving to the cloud. The resulting uptick in cloud and SaaS tool adoption accelerated the growth of enterprise data volumes, with a McKinsey survey finding that nearly two-thirds of decision-makers say their organizations increased cloud budgets as a result of the pandemic.
Today, every company competes with data. With more data being created and collected than ever, the ability to harness that data is the main differentiator driving businesses forward. Yet there remains a disconnect between our expectations for data and our ability to actually unlock its full potential. The key to unlocking greater value is data productivity, but to achieve it, the enterprise needs effective tools that can connect and use data to deliver real-time insights.
The Data Productivity Revolution
We have reached a turning point in business. Companies must tap into the full potential of their data and close the gap between expectations and true data productivity. This means modern data teams have to deliver business-ready data faster.
First, let’s take a step back and look at how we define productivity. During the industrial revolution, productivity was considered a means to increase scale. Processes were invented to synchronize work and ramp up output. Workers were trained around machines, and the machines did the heavy lifting. These new processes let the work of a few have a lasting impact on many.
So, how do we define productivity today? Far beyond a simple unit of measurement, productivity is forward movement; it is velocity combined with accuracy. Productivity ultimately propels us toward progress — it is the solid foundation needed to drive innovation. Our ideas of productivity have evolved over time, and even now, we approach it somewhat differently, especially in the wake of the pandemic.
When it comes to data, productivity means making data useful so we can accomplish more. Useful data is taken from its raw state, then transformed and enriched with metrics so that it can effectively deliver insights and accelerate progress. And for the modern data team, the ability to harness this data and help users capitalize on it is what drives businesses forward.
The Real Cost of Unproductive Data
When data is not made usable for everyone, the cost to companies is steep. Recent studies show that data teams are frequently unsure how all the data they collect is being used, and complex, slow processes cause much of that data to go to waste — costing enterprises millions of dollars.
Take the rise in cloud adoption and migration — we know that data behaves differently in the cloud, and as it sprawls, its accessibility and integrity are increasingly called into question. While businesses navigate these Black Swan events, data teams may suddenly find themselves overburdened, which only magnifies the struggle to make data useful. To compensate for outdated migration and maintenance processes, they often have to dedicate hours to solving data usability challenges, costing them not just time but overall productivity. When data teams must devote too much of their time to tasks like manual coding instead of the strategic work and analysis that moves the needle, it becomes impossible to be productive with data.
All of this has a very real fiscal impact across the business and diminishes the company’s ability to remain data-driven. When this happens, the repercussions span far and wide, including slower time to value and decisions made on outdated information, which creates inefficiencies in the business.
A Paradigm Shift in Technology
We have reached a point where data is now creating data, leading to chaos as that data becomes increasingly difficult to work with. Approaches to wrangling data are out of date before we can scale them. Old habits prevent new processes from being created. And in the face of all this, our tools have let us down. Even the most advanced data engineers have been toiling away with antiquated methods and low-level technologies. And the fact is, there just aren’t enough people equipped to work with data, so generating more of it does us little good.
But if we could effectively use data with the right strategy, it would change everything. The ideal approach would serve the most advanced data engineer yet also be usable by those who don’t code. It should integrate with legacy, cloud, and modern data platforms; be launched and learned in minutes; and still scale to the most sophisticated use cases across the organization — and adapt to what those use cases will be tomorrow.
About the author: Ed Thompson, CTO and co-founder of Matillion, started his career as an IBM software consultant before launching Matillion with co-founder Matthew Scullion in 2011, bringing together best-in-class technologies from across the software ecosystem and applying them to solving the deep and complex requirements of modern businesses in new and disruptive ways.