September 26, 2016

Inflexible Data, Analytics Fueling Failures, Survey Finds

You would be hard-pressed to find a business executive who does not believe data initiatives are critical to company growth. Still, a large number of companies say their initial data initiatives have failed due to issues like “data inflexibility.”

That’s the key finding of a data analytics study compiled Monday (Sept. 26) by Dimensional Research, which reported that a whopping 88 percent of executives said they have experienced “failures” with recent big data projects. The key reason for the flops was “data inflexibility,” particularly in existing analytics infrastructure.

Fully three-quarters of those surveyed said a rigid data infrastructure prevented them from “acting on business requests.”

The reasons ranged from dissatisfied users (36 percent) and less-than-expected use of new data platforms (40 percent) to costs exceeding expected returns. Meanwhile, 31 percent cited “zombie projects” that were almost complete but never quite ready for use.

Just over half of respondents said the leading cause of failed data initiatives was complex or inflexible data and analytics infrastructure that fell short of users’ requirements. That was followed closely (47 percent) by a lack of technical expertise.

Those types of complaints have fueled vendor efforts to deliver “self-service” analytics platforms that are easier to use and don’t necessarily require data science skills. Not surprisingly, business executives surveyed by the market researcher complained most about inflexible data and infrastructure, while frontline IT managers were “more willing to accept a complex architecture and try to make it work even though it is less than desirable.”

Still, the survey found that more than two-thirds of those queried said existing data infrastructure does not allow them to perform desired analytics projects. For example, 71 percent said inflexible infrastructure makes it difficult to troubleshoot data or analytics tools.

Meanwhile, the data skills gap continues to grow, with respondents reporting shortfalls in technical expertise (in descending order) for “database tuning,” data science and engineering, Hadoop, distributed programming and SQL. Only 15 percent of respondents reported no challenges finding technical expertise.

The survey, which was commissioned and released by cloud data warehouse vendor Snowflake Computing, used these findings on inflexible data analytics and infrastructure to make the case for cloud analytics, which the researcher insisted “has the potential to deliver these benefits since the types of infrastructure inflexibility issues that can cause data projects to fail are handled by experts that focus only on these issues.”

Among the advantages of cloud analytics, according to the survey, are: faster deployment of data initiatives, reduced time required to make data available to analysts, lower infrastructure management overhead and the availability of standard interfaces and tools.

Moreover, the survey found that the “pay-as-you-go” cloud analytics approach was appealing for more than just the cost savings: 92 percent of respondents said a cloud licensing model would provide the needed flexibility to experiment with their data efforts.

Dimensional Research said it surveyed 376 executives and data managers during the summer of 2016. All were using data warehousing, Hadoop or NoSQL technologies along with business intelligence applications.

Recent items:

Another Self-Service Database Tool Emerges

Why Self-Service Prep is a Killer App for Big Data
