May 24, 2016

Big Data Doesn’t Always Mean Better Business

Elbert Hearon

(Lightspring/Shutterstock)

There is an unprecedented volume of data being created, with an unprecedented number of people around the world regularly producing and storing data. Research shows that 90 percent of the data in the world today was created in the last two years alone. This may not be news to those of us who plan for, manage, or process this barrage of data, but questions still remain about best practices when taking on infrastructure changes to address big data in a big way.

Without structure, big data is simply noise: massive amounts of information pouring in from a large, and growing, pool of internal and third-party sources. The question was once how to extract, process, and store the data; now it is how to manage, analyze, and operationalize accurate insights from it.

It is very important to get data right, so why are so many organizations mishandling and misunderstanding it? Businesses want to quickly deploy, track, and categorize data, but many still don't understand what these new data elements mean to the business. Many organizations also use big data on a presumed basis, without verifying its accuracy, which leads to analytical insights built on faulty assumptions.

Make Sure Bad Data Doesn’t Take You Off Course

Big data means that once-minute bad data problems are magnified into big problems that undermine the very analytical results touted as one of big data's values. As the proliferation of data continues, it is very important for organizations to get data management right. IT needs to wrap its head around big data, from start to finish.

Data volume continues to grow, and new and third-party data sources pose a mounting challenge to information technology leaders. To meet this challenge, IT leaders must equip themselves with a plan that spans the entire enterprise and breaks down the walls between departments.

IT must think bigger and manage the flow of data across the entire organization. By adding data analytics and data integrity to instinctual decision-making, an organization that aligns its business and IT departments can rise above its competitors, especially as data quality improves and analysis becomes more advanced. This collaboration may elevate the role of data officer to the C-suite, where a Chief Data Officer can catalyze widespread adoption of enterprise data governance, elevating data from a tactical asset to a strategic one used for competitive advantage.

This seems simple enough, so why isn't it? Bad data can easily take you off course, and on top of that, the IT and business departments typically are not working together to fix these issues.

Data Governance Automation Stops Bad Data in Its Tracks

Sound governance principles rely on automation to safeguard data integrity by monitoring the health of the data. With automated, continuous data integrity checks and deductive analysis, organizations can ensure that data anomalies are flagged and reconciled, and that workflow processes route exceptions to the correct people for timely resolution, preventing bad data from proliferating. Implementing enterprise data integrity checks and balances ensures the IT department is providing the business departments with accurate data to drive business decisions.
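To make this concrete, here is a minimal sketch of what a continuous integrity check with exception routing might look like. The record shape, the rules, and the route_exception handler are hypothetical stand-ins for illustration, not a description of any particular vendor's platform.

```python
from dataclasses import dataclass

@dataclass
class Record:
    # Hypothetical transaction record; the field names are illustrative.
    order_id: str
    amount: float
    currency: str

def check_integrity(record: Record) -> list[str]:
    """Return the list of rule violations for one record."""
    violations = []
    if record.amount < 0:
        violations.append("amount must be non-negative")
    if record.currency not in {"USD", "EUR", "GBP"}:
        violations.append(f"unknown currency: {record.currency}")
    return violations

def route_exception(record: Record, violations: list[str]) -> None:
    # Stand-in for a real workflow step (ticket, queue, alert) that
    # sends the flagged record to the right owner for resolution.
    print(f"EXCEPTION {record.order_id}: {'; '.join(violations)}")

def run_checks(batch: list[Record]) -> list[Record]:
    """Flag anomalies, route exceptions, and pass clean records on."""
    clean = []
    for record in batch:
        violations = check_integrity(record)
        if violations:
            route_exception(record, violations)
        else:
            clean.append(record)
    return clean

if __name__ == "__main__":
    batch = [
        Record("A-100", 42.50, "USD"),
        Record("A-101", -7.00, "USD"),  # flagged: negative amount
        Record("A-102", 19.99, "XYZ"),  # flagged: unknown currency
    ]
    good = run_checks(batch)
    print(f"{len(good)} of {len(batch)} records passed")
```

In practice the rules would be declarative and maintained by data stewards rather than hard-coded, but the flow (check, flag, route, resolve) is the same.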

(Graphicworld/Shutterstock)

This kind of automated data integrity and quality monitoring gives the analytics and business teams a basis for determining and quantifying the impact of such issues on analytics outcomes. A cost-benefit analysis can then determine whether there is value in fixing an identified data integrity or quality issue: the benefit to the analytics may be significant enough to justify the fix, or the analysis may show that fixing the issue isn't necessary. In the latter case, automated monitoring ensures that the impact doesn't drift over time, and it identifies and manages sudden changes in integrity or quality caused by outside events (e.g., system modernization or the introduction of a new process or product) before they distort the analytics.
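One simple way to catch the sudden quality shifts described above is to track a quality metric over time and alert when it falls outside its historical range. The sketch below uses a basic three-sigma control check on a daily pass rate; both the metric and the threshold are illustrative assumptions, not a prescribed method.

```python
import statistics

def quality_shift_detected(history: list[float], latest: float,
                           sigmas: float = 3.0) -> bool:
    """Return True if the latest quality score (e.g., the percentage of
    records passing integrity checks) falls outside the historical norm."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(latest - mean) > sigmas * stdev

# Daily pass rates from the automated integrity checks (illustrative data).
daily_pass_rate = [99.1, 98.9, 99.3, 99.0, 99.2, 98.8, 99.1]
today = 94.5  # e.g., a system migration started emitting malformed records

if quality_shift_detected(daily_pass_rate, today):
    print("Quality shift detected: investigate before trusting the analytics")
```

A drop like this would trigger exactly the kind of investigation the cost-benefit analysis above depends on: quantify the impact on analytics outcomes first, then decide whether the fix is worth making.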

For example, consider a commercial airliner cruising from point A to point B. Once the flight reaches cruising altitude, the pilot typically turns on the autopilot. At 30,000 feet and 500 miles per hour, small alterations driven by human error can mean the difference between landing at Chicago O'Hare on time and finding yourself hundreds or thousands of miles off course. Automation prevents such egregious errors, saves money on fuel, and gives the pilots a much-needed break so they are prepared for unexpected circumstances.

This is why it’s better to automate processes and let machines do what they do best.

Similarly, problems with data quality can take your business decisions off course. End-to-end analysis of data can diminish risk by automating the process and, in turn, lowering the cost of ensuring quality data from the beginning.

Like a pilot off course, bad data gets worse when organizations assume their data is trustworthy and then feed it into the analytics behind their decisions. When companies ignore bad data, the result is poor insights, inaccurate interpretations, and off-the-mark conclusions. As they say, garbage in, garbage out: in the example above, that means finding yourself off course and missing an on-time landing.

Ensuring data dependability should be the first step in putting big data into action. Automated, integrated data quality solutions can help IT keep the organization on a true path, getting the business to the proper destination every time.

Organizations that foster strong collaboration between business and IT departments will be a step ahead in solving the bad data dilemma. Making data integrity and quality considerations part of any big data or analytics initiative is a must in today's business environment.

 


About the author: Elbert Hearon is a senior solutions consultant at Infogix. His role is to advise prospective and existing clients on the optimal use of Infogix's Data Analysis Platform. He has more than 20 years of experience teaching quantitative courses at various colleges and universities. Alongside his work at Infogix, he teaches courses in predictive analytics and advanced data mining at Northwestern University and Elmhurst College. Elbert holds a Master of Science in Applied Statistics from DePaul University, a Master of Business Administration from The University of Chicago Booth School of Business, and a Bachelor of Arts in Mathematics and Economics from Indiana University.

Related Items:

Four Steps to Transforming Your Organization with Data Virtualization

Merging Big Data Analytics Into the Business Fastlane

Unleashing Artificial Intelligence with Human-Assisted Machine Learning
