November 1, 2018

Focus on Business Processes, Not Big Data Technology

(BigBlueStudio/Shutterstock)

The rapid evolution of analytics has put a wonderful array of cutting-edge technologies at our fingertips, from Spark and Kafka to TensorFlow and scikit-learn. And yet, despite this technological treasure trove, the vast majority of big data projects fail, according to analyst firms. So what gives? It's likely a combination of factors, but one that stands out, industry experts say, is that we spend too much time focusing on technology and not enough on business processes.

Gartner analyst Nick Heudecker turned some heads last year when he said 85% of big data projects fail, citing poor integration with existing business processes, as well as internal policies, a lack of executive buy-in, skills shortages, and the ever-present security and governance issues. Other tallies of big data, data science, and advanced analytics projects have turned up similar statistics about the narrow odds of success in big data.

The fact is, losing at big data is a lot more common than winning. The human skillsets required to stitch together complex, largely open-source technologies into something that's enterprise-grade and contributes value to the business are not easy to find. This is something the Hadoop ecosystem has been grappling with for the past five years, and it has driven scads of businesses into the arms of comfy clouds, where customers can avoid all that muss and fuss by utilizing pre-built, pre-integrated, pre-configured big data systems.

There's no doubt that this is tough stuff, but that isn't a reason to avoid it. To paraphrase President John F. Kennedy, we didn't aspire to put a man on the moon because it was easy, but because it was hard. Most of life's most worthwhile accomplishments are difficult, which makes the victories that much sweeter. But at the same time, there's no point in over-complicating what is already an extremely complicated enterprise.

Bill Schmarzo, CTO of IoT and Analytics at Hitachi Vantara

Bill Schmarzo, the Dean of Big Data, has hammered this point home over the years. While at Dell EMC, he encouraged customers to apply the SAM test to any potential data analytic they were considering: is it strategic, actionable, and material? Schmarzo is now the CTO of IoT and analytics at Hitachi Vantara, but he continues to stress the importance of business processes.

During a keynote address at Hitachi Vantara's NEXT conference in September, Schmarzo encouraged the audience to reframe the conversation away from a focus on analytics.

“This conversation needs to be about business outcome,” he said. “You hold the organization’s single most important asset, data, and it’s out of that data we can glean customer, product, service, and operational insights that we can use to improve our business and operational processes, to mitigate risk and compliance issues, to uncover new revenue streams, to deliver a compelling and differentiated customer experience.

“It’s not an individual technology transformation,” he continued. “It’s a digital business model transformation. It’s transforming the business model with the data and analytic insights you glean from that.”

Teradata CTO Stephen Brobst said similar things during a session at the Teradata Analytics Universe conference last month. During the session, Brobst encouraged attendees to think about how analytics projects can impact business processes, specifically within the context of “real time” analytics.

“People talk about real time as some kind of marketing buzzword,” he said. “A much better term is right time. Right time is not defined by technology. Right time is defined by your business process.”

The goal of a “right time” analytic system, Brobst continued, is to drive as much latency out of the decision-making process as makes business sense. The value that a company can get from making a good business decision based on analytics drops as time goes on. But that value doesn’t drop equally for all businesses in all types of industries and all circumstances, so it’s critical to take the specifics into account.

For example, an e-commerce retailer that wants to act before a customer leaves its website must make a decision quickly. A bank needs to decide whether a transaction is likely fraudulent before the money moves. But for most strategic decisions, fresh data isn't that important.

“If your strategy is based on the last two hours’ worth of sales, you’re in trouble,” Brobst said. “Strategy does not need the most up-to-date data….We don’t need real-time data to make good [strategic] decisions. In fact, it’s probably distracting.”

Real-time data becomes more valuable when it comes to executing a business strategy. And the value proposition of data diminishes with each increment of latency introduced into the business process, he said. It can be tremendously expensive to refresh a Teradata environment with the latest data every 30 seconds, which is why it’s critical to match the acceptable latency with the strategy and the business process goal.

“Right time is aligning the technology investment that you make with your business process, because if I did ‘real time,’ but your business processes are still operating daily, then we spent a lot of money and didn’t add any value,” Brobst said. “The cost of really doing it in 30 seconds is more than the value. For their business processes, 15 minutes is good enough.”
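Brobst's tradeoff lends itself to a little arithmetic. Below is a minimal Python sketch of the reasoning, with entirely made-up numbers (the article supplies none beyond the 30-second and 15-minute anchors): decision value decays as data gets staler, while refresh costs climb with frequency, so the net-value-maximizing interval can easily land at 15 minutes rather than 30 seconds.

```python
import math

# Entirely hypothetical numbers -- value, half-life, and refresh cost are
# made up to illustrate the shape of the tradeoff, not any real deployment.
DECISIONS_PER_DAY = 100         # decisions acted on per day
VALUE_AT_ZERO_LATENCY = 1000.0  # dollars per decision if acted on instantly
HALF_LIFE_S = 3600.0            # decision value halves per hour of staleness
COST_PER_REFRESH = 100.0        # dollars per warehouse refresh

def decision_value(latency_s):
    """Value of one decision made latency_s seconds after the event,
    assuming exponential decay of decision value with staleness."""
    return VALUE_AT_ZERO_LATENCY * math.exp(-math.log(2) * latency_s / HALF_LIFE_S)

def net_value_per_day(refresh_interval_s):
    """Gross decision value per day minus the cost of refreshing the
    analytics environment every refresh_interval_s seconds."""
    gross = DECISIONS_PER_DAY * decision_value(refresh_interval_s)
    refreshes = 86400.0 / refresh_interval_s
    return gross - refreshes * COST_PER_REFRESH

for label, interval in [("30 seconds", 30), ("15 minutes", 900),
                        ("1 hour", 3600), ("daily", 86400)]:
    print(f"{label:>10}: net value/day = ${net_value_per_day(interval):,.0f}")
```

With these toy parameters, the 15-minute interval wins: refreshing every 30 seconds costs more than the extra freshness returns, while daily refreshes let the decision value decay away. That is the "right time, not real time" point in miniature.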

Another problem with big data technology is that it often never makes it out of the data scientist’s sandbox, Brobst said.

“What I observe is a lot of organizations do this great data science and they never actually put it into production, or their version of production is it’s running on a desktop or in a closet with huge operational risk behind it,” he said. “You don’t get to count your money until it goes into production. So make sure that you productize – out of the closet or out from under the desk – and put it into a properly governed environment.”
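To make the contrast tangible, here is a minimal, hypothetical sketch of what "out of the closet" might look like for a scikit-learn model: the artifact is version-pinned, inputs are validated, and every score leaves an audit trail. The model name, feature list, and file path are illustrative assumptions, not details from Brobst's talk.

```python
import logging
import joblib  # commonly used to persist scikit-learn models

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("scoring")

# Hypothetical governance details; none of these names come from the article.
FEATURES = ["tenure_months", "monthly_spend", "support_tickets"]
MODEL_VERSION = "churn_model-1.0.3"

# Load a pinned, versioned artifact rather than a variable left in a notebook.
model = joblib.load(f"{MODEL_VERSION}.joblib")

def score(record: dict) -> float:
    """Validate inputs, score one record, and leave an audit trail."""
    missing = [f for f in FEATURES if f not in record]
    if missing:
        raise ValueError(f"missing features: {missing}")
    x = [[record[f] for f in FEATURES]]
    prob = float(model.predict_proba(x)[0][1])
    log.info("model=%s inputs=%s churn_prob=%.3f", MODEL_VERSION, record, prob)
    return prob
```

The point is less the specific libraries than the properties: a reproducible artifact, explicit input contracts, and logged decisions that a governance process can actually review.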

Teradata CTO Stephen Brobst

That’s not to say that technology plays no role. Technological advances, in the end, are the source of the big data revolution that continues to unfold. Deep learning technology may be overhyped, but it also will likely play a big role in corporate decision-making, including automating many decisions that today are made by humans.

“These technologies have huge impact on all business,” Brobst said of deep learning. “It’s not just about the dot-com business change. It’s about changing process for companies like this – heavy industrial companies doing smarter things with their data.”

But in the final analysis, if we want to succeed with big data analytics and the array of AI technologies coming down the pike, we're going to need to think more intelligently about how we apply them, and be very deliberate in adapting specific business processes to them, because the alternative is just more spending on failed big data projects.

“It’s not just about technology. It’s about business process change, and if it doesn’t happen, your technology delivered no value,” Brobst said. “The hardest part is not the technology. It’s changing the way you think. Humans are much harder to change than technology, so business processes are a part of it.”

Related Items:

Deep Learning Is Great, But Use Cases Remain Narrow

5 Reasons Data Science Initiatives Fail

One Deceptively Simple Secret for Data Lake Success
