October 9, 2012

Seven Steps to Revamping Data Approaches

Ian Armas Foster

Everyone is looking for advice on big data these days. Right now, many businesses are simply wading in the shallow end of the big data pool instead of swimming laps across it, or merely surviving the big data tidal wave instead of surfing it. No matter which aquatic metaphor you prefer, the point is to succeed rather than merely exist.

Yellowfin CEO Glen Rabie and Actian Vectorwise General Manager Fred Gallagher put together a webinar to address that topic, coming up with 7 Best Practices for Big Data and BI. Their suggestions seem to apply best to smaller organizations that are agile enough to at least experiment with new ideas. And yes, while the webinar (not surprisingly) devolved into a demonstration of how Yellowfin products help achieve those goals, some of the salient general points they offered are certainly worth repeating.

Harnessing big data’s power is important to those medium-to-large companies that wish to remain competitive. This is not a revelation. It sometimes helps, however, to better understand the global scope of exactly how big the data is and how it is meant to be used. According to Rabie’s projections, 7.9 zettabytes of data will be available in 2015.

Though seemingly superfluous, the word available in that sentence is telling: companies are laying their hands on far too much data simply because it is available. Indeed, according to an MIT Sloan study, 60% of organizations take in more data than they can effectively use.

It has been repeated over and over on this site that simply investing in big data without much of an idea of what exactly to do with it is a terrible plan. Oracle President Mark Hurd himself wants you to know that you will fail for the same lack-of-focus reasons Yellowfin and Actian Vectorwise point to. Even so, business managers expect a lot out of big data analytics: according to Rabie, citing a Harris poll, 70% of executives who invest in big data expect a return on that investment within a year.

To make good on those expectations, an intelligent plan such as the one Gallagher proposes is required. His seven-step plan covers the analytics process from initial goals to user-friendly output and runs as follows:

  1. Focus on what you want to achieve
  2. Identify the data you have vs. the data you need
  3. Use the right big data tool for the job
  4. Use a fast database
  5. Plan for a mixed architecture
  6. Ensure mass distribution of your data
  7. Tailor data delivery to each audience

Each step in itself is important, but taken together they represent a concise path each company interested in data analytics should take.

In a poll of 325 business managers, 61% responded that big data capabilities would fuel “Better targeted social influence marketing.” Even though each respondent was allowed five answers, that response was the only one to garner a majority, with the admittedly vague “more numerous and accurate business insights” coming in second at 45%.

That personalized marketing is a main goal of those looking to implement analytics is not surprising; it is another topic covered ad nauseam here. Those 61% would then want to focus on leveraging sentiment analysis, user-generated media, and social interactions. Not surprisingly, those types of data are among the most complex and expensive to deal with, for a couple of obvious reasons.

The first is that the sheer volume of that data consistently measures in the petabytes. The second is that it is almost always unstructured, meaning simple BI tools and relational databases have trouble processing it. As a result, Gallagher recommends a multi-database approach: for example, using Hadoop for the more social-based analytics in concert with the relational databases that make sense of the transactional data, as in the sketch below.
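To make the idea concrete, here is a minimal Python sketch of what such a mixed architecture might look like at the application layer. Everything in it is an illustrative assumption rather than anything shown in the webinar: the hadoop_sentiment dictionary stands in for the exported output of a hypothetical Hadoop job over social data, and the orders table is sample transactional data in an in-memory relational database.

```python
import sqlite3

# Hypothetical output of a Hadoop job over unstructured social data:
# per-product average sentiment, exported as simple key-value pairs.
hadoop_sentiment = {"tablet": 0.72, "phone": 0.41, "laptop": 0.63}

# The structured, transactional side stays in a relational database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (product TEXT, revenue REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("tablet", 1200.0), ("phone", 450.0), ("tablet", 800.0)])

# Join the two worlds in the application: relational aggregates
# enriched with sentiment derived from the unstructured data.
for product, revenue in db.execute(
        "SELECT product, SUM(revenue) FROM orders GROUP BY product"):
    sentiment = hadoop_sentiment.get(product)
    print(f"{product}: revenue={revenue:.2f}, social sentiment={sentiment}")
```

The design point is that neither system has to do the other's job: the Hadoop side reduces messy social data to compact aggregates, and the relational side keeps doing what it is good at.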

In the fast-paced business world, sometimes too much is expected of a system that is trying to comprehend terabytes or petabytes of data at a time. According to Gallagher, web-based users look for answers within ten seconds. Mobile users look for responses in three.
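Those expectations translate naturally into per-channel latency budgets. The following sketch is purely illustrative: the ten-second and three-second figures come from Gallagher, but the LATENCY_BUDGET table and the run_query stand-in for a real database call are hypothetical.

```python
import concurrent.futures
import time

# Response-time budgets per Gallagher: web users look for answers
# within ten seconds, mobile users within three.
LATENCY_BUDGET = {"web": 10.0, "mobile": 3.0}

pool = concurrent.futures.ThreadPoolExecutor(max_workers=4)

def run_query(seconds):
    """Hypothetical stand-in for a real database query."""
    time.sleep(seconds)
    return "result set"

def answer(channel, query_seconds):
    # Give the query only as long as the channel's budget allows.
    future = pool.submit(run_query, query_seconds)
    try:
        return future.result(timeout=LATENCY_BUDGET[channel])
    except concurrent.futures.TimeoutError:
        return f"{channel}: query exceeded its {LATENCY_BUDGET[channel]:.0f}s budget"

print(answer("mobile", query_seconds=5))  # blows the 3-second budget
print(answer("web", query_seconds=5))     # fits within the 10-second budget
```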

Per Gallagher, slow query performance is the number one reason for BI failure and frustration. For the most part, it boils down to investing in the right fast database. Gallagher claims that 70% of data warehouses experience “performance constrained issues.” That is clearly not good enough.

Gallagher notes that bigger warehouses are overrated and that performance can often be improved by economizing on hardware rather than simply adding more of it.

Of course, actually processing the data is important. But the vast majority of those working with data will not be responsible for overseeing that architecture; they will be the end users. Fittingly, the presentation culminated in a demonstration of Yellowfin’s real-time visualization of global electronics sales.
