November 8, 2016

Election 2016: A Rich Opportunity for Data-Centric Storytelling


Get ready to consume some data. If you’re following the election today, you’re likely to be immersed in it. Thanks to a new generation of interactive data visualization tools, news junkies have powerful new ways to ingest huge amounts of information, from both the real world and cyberspace.

Big news events like the 2016 presidential election have become opportunities for vendors of data visualization tools to strut their stuff. Check out the story “8 Interactive Visualizations for Election Day” to see how some of the top names in data visualization are handling the event.

You can also find compelling visualizations on practically every major news site, many of which use interactive visualization tools from major vendors. For instance, the interactive visualizations you’ll find at the Atlanta Journal Constitution, Euronews, the National Post, and ZDNet are built with software from analytics and visualization vendor Qlik. Other vendors represented in the nation’s newsrooms include Tableau, Alteryx, and Microsoft.

James Fisher, vice president of global product marketing for Qlik, says data and interactive analysis are playing a greater role in forming public opinion about the issues of the day.

The US presidential election is expected to draw about 130 million voters

“What we’re trying to facilitate is to leverage visualizations as a way in which the consumer of that story can more effectively interact with data that supports the story,” Fisher tells Datanami. “They can explore the data and draw their own conclusions and their own insights from the data that relates to that story.”

For a consumer of the news, it’s hard to beat an interactive visualization that’s been carefully crafted to convey how different pieces of data affect each other. In today’s election, for example, news watchers will be looking to see how metrics like voter turnout in individual states help swing contests one way or the other.
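To make that concrete, here’s a minimal sketch, in Python with the Plotly library, of the kind of interactive graphic described above: hover over a point and the state behind it is revealed. The states and figures are invented for illustration, and nothing here is drawn from any outlet’s actual code.

```python
# Minimal interactive scatter chart: turnout vs. margin by state.
# All data below is made up for illustration purposes.
import pandas as pd
import plotly.express as px

df = pd.DataFrame({
    "state": ["Ohio", "Florida", "Iowa"],   # hypothetical subset of states
    "turnout_pct": [64.2, 65.6, 68.4],      # invented turnout figures
    "margin_pct": [-8.1, -1.2, -9.4],       # invented candidate margins
})

fig = px.scatter(
    df, x="turnout_pct", y="margin_pct",
    hover_name="state",                     # hovering reveals the state
    labels={"turnout_pct": "Turnout (%)", "margin_pct": "Margin (%)"},
    title="Turnout vs. margin by state (illustrative data)",
)
fig.show()  # opens the chart in a browser, ready to explore
```

Letting readers poke at each point themselves is exactly the “draw their own conclusions” experience Fisher describes.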

Whereas graphics may have supported stories in the past, interactive visualizations today are the story. A picture may be worth 1,000 words, but an interactive visualization may be worth much more.

“It’s increasingly much more about an interactive journey for the consumer, as opposed to the presentation of data that backs up the story,” Fisher continues. “That’s really what we’ve been trying to do with all the work we’ve been doing over the last year, whether it’s the presidential election in the US or whether it’s about MLB or whether it’s about golf–we’re trying to create that interactivity in the story and the data that supports it.”

Visualizing Voter Sentiment

A favorite source of data for interactive visualizations today will be Twitter, particularly for anybody looking for hints as to which way the presidential election will swing.

After the election, many analysts will look to Twitter to gauge popular sentiment about the outcome

While many big data analytics professionals have disparaged the social media network as being full of garbage, it’s actually quite useful, and highly accurate, for sentiment analysis during controversial events, according to Expert System, a developer of semantic intelligence tools.

“We think that when we deal with polarizing issues, like referendums or bipartisan elections, Twitter data offers a better indication” than standard polling, Expert System told Datanami earlier this year, after the company correctly predicted the result of the Brexit referendum. British voters narrowly voted to withdraw from the European Union, contradicting traditional polls conducted in the days before the vote.

Twitter data beats standard polling data for two reasons, the company said. “First there is enough content to make the sample size significant, and the positions are more clearly expressed,” the company said. Second, the kind of analysis Twitter enables is much more precise: it’s better suited to close-ended questions like “do people prefer brown or black dress shoes for next winter” than to open-ended ones like “let’s see what shoes people want for next winter,” the company said.
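Expert System’s semantic technology is proprietary, but the close-ended approach it describes can be roughed out with off-the-shelf tools. The toy Python sketch below uses NLTK’s VADER sentiment scorer plus naive keyword matching to tally tweets for the two sides of a binary question; the tweets are invented, and a real analysis would be far more sophisticated.

```python
# Toy close-ended sentiment tally: which of two sides does each tweet back?
# One-time setup: import nltk; nltk.download("vader_lexicon")
from nltk.sentiment.vader import SentimentIntensityAnalyzer

tweets = [  # stand-in sample; a real analysis would pull from the Twitter API
    "Voted Leave today and proud of it!",
    "Remain is the only sane choice, honestly.",
    "So tired of this referendum nonsense.",
]

sia = SentimentIntensityAnalyzer()
tally = {"leave": 0, "remain": 0}

for tweet in tweets:
    text = tweet.lower()
    score = sia.polarity_scores(tweet)["compound"]  # -1 (negative) to +1 (positive)
    for side in tally:
        # Credit a side only when a tweet names it with clearly positive
        # sentiment, reflecting the "clearly expressed" positions noted above.
        if side in text and score > 0.3:
            tally[side] += 1

print(tally)
```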

Closing Polls

For better or for worse, the world has moved beyond standard polling, and that trend is on full display today. While the presidential candidates bicker over whether individual polls are accurate, the shifting demographics of the American populace, along with the declining accuracy of traditional polls conducted largely by sampling pages from the phone book, mean we’re more dependent than ever on analytics to get an accurate picture of reality.

FiveThirtyEight founder Nate Silver uses ensembles and his own research to try to beat the 3-4% sampling error in most polls
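That 3-4% figure is easy to check from first principles: for a simple random sample of n respondents, the 95% margin of error on a proportion p is roughly 1.96 * sqrt(p(1-p)/n). A quick back-of-the-envelope calculation in Python:

```python
# 95% margin of error for a simple random sample, at the worst case p = 0.5.
import math

def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (600, 1000, 1500):  # typical poll sample sizes
    print(f"n={n}: +/- {margin_of_error(n):.1%}")
# prints roughly 4.0%, 3.1%, and 2.5%: squarely in the 3-4% range for
# the 600-to-1,000-person samples most national polls use
```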

It’s all about having a greater understanding of how the data interrelates, Qlik’s Fisher says. “What we’re seeing in opinion polls in this election and elections that have gone before reflects that increasing complexity and maturity in the way that people are leveraging different types of data and bringing different types of data together to generate new insight and hopefully improve the accuracy and trustworthiness of that insight,” he says.

Fisher looks to Nate Silver, the founder of FiveThirtyEight, for clues on how best to combine different models and data sets in the political realm to create something greater than the sum of its parts.

“I think that the models that Nate Silver has put together are extremely interesting and extremely powerful,” he says. “I think what that team is doing is showing some best practices in understanding different data sets, understanding the relationship between those data sets and using combinations of data to drive hopefully new and more accurate insights.”
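FiveThirtyEight’s actual model is considerably more involved, but the core idea of blending polls can be sketched in a few lines: weight each poll by its sample size and recency, then average. The Python snippet below illustrates that general technique only; the half-life parameter and the polls themselves are invented, and Silver’s method additionally adjusts for house effects, demographics, and more.

```python
# Blend several polls into one estimate, weighting by size and freshness.
def blended_margin(polls, half_life_days=7.0):
    """polls: list of (margin_pct, sample_size, age_days) tuples."""
    num = den = 0.0
    for margin, n, age in polls:
        # Larger and more recent polls get more weight; a poll's weight
        # halves every `half_life_days` days.
        weight = n * 0.5 ** (age / half_life_days)
        num += weight * margin
        den += weight
    return num / den

polls = [  # hypothetical (margin, sample size, days old)
    (+3.0, 1200, 1),
    (+1.0,  800, 4),
    (-2.0,  600, 10),
]
print(f"blended margin: {blended_margin(polls):+.1f} points")  # about +1.8
```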

Calling a Winner

Data blending and analytics will also play a role in calling the winner of today’s presidential election.

The national press corps employs a team of analysts to ensure we don’t have a repeat of the 2000 election

As Time reporter Haley Sweetland Edwards wrote in a November 3 story called “How the Media Got Smarter About Calling Elections,” the National Election Pool (NEP) relies heavily on data analysts to ensure that we don’t have a repeat of the debacle of the 2000 presidential election, when the media first reported that Al Gore had won Florida, then switched the state to George Bush, before reversing course once again hours after the last polls closed. (The Supreme Court, you will remember, ultimately decided that election.)

To ensure the accuracy of the media’s election reporting, and specifically to ensure that decisions to “call” results before every last ballot has been counted are sound, the Associated Press, along with the five NEP members (ABC, NBC, CBS, CNN, and Fox News), created an elaborate process for handling exit poll data.

The process involves data analysts who are quarantined in two computer-less and phone-less rooms in Washington and New York. After the exit-poll data is securely transmitted to the rooms, the quants analyze it and apply the necessary demographic corrections to ensure that it’s accurate, Time reports. It’s up to the analysts to decide whether the data supports calling the election one way or the other, but only after the last poll has closed.
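Time doesn’t spell out the math, but the “demographic corrections” it mentions are typically a form of post-stratification: reweighting respondents so each demographic group’s share of the sample matches its known share of the electorate. A simplified Python sketch, with hypothetical groups and numbers:

```python
# Post-stratification: correct an exit poll whose sample over-represents
# some demographic groups and under-represents others.
def poststratify(responses, population_share):
    """responses: {group: (respondents, pct supporting candidate A)}
    population_share: {group: group's true share of the electorate}"""
    total_n = sum(n for n, _ in responses.values())
    adjusted = 0.0
    for group, (n, pct_a) in responses.items():
        sample_share = n / total_n
        weight = population_share[group] / sample_share  # fixes over/under-sampling
        adjusted += sample_share * weight * pct_a
    return adjusted

responses = {"urban": (700, 58.0), "rural": (300, 41.0)}  # raw (invented) exit poll
population_share = {"urban": 0.5, "rural": 0.5}           # known electorate mix
print(f"adjusted support for A: {poststratify(responses):.1f}%")
# prints 49.5%, not the 52.9% the raw, urban-heavy sample would suggest
```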

Will the analysts working for the national press corps use data visualization tools to help them make corrections? We’ll never know. But one thing’s for certain: interactive data visualizations are here to stay.

Related Items:

8 Interactive Visualizations for Election Day

Why Winning Politics Is Now Tied to Big Data Analytics
