The run-up to the 2016 U.S. election is being covered in interesting new ways by the political media, with Big Data analysis and real-time opinion polling offering journalists deeper insight than ever before. The trend of “data journalism” is peaking as the media embraces advanced technologies that allow them to deliver a new breed of numbers-driven, fact-based journalism.
The tools being used for data journalism open up possibilities for fresh perspectives, more in-depth reporting, and new stories behind the numbers. Traditional journalists are beginning to see how data journalism can complement their reporting, and the U.S. election is serving as an ideal testing ground. Political reporters are embracing the improved data literacy and access to objective analysis, which is making their reports more thorough and informative.
Consequently, American voters are becoming digital voters. They have access to real-time, data-driven information and public sentiment, which is empowering them with broader insight. They’re relying on this to help them make up their minds before they cast their vote, and it’s given many voters a renewed interest in becoming informed citizens able to make an educated choice.
However, the rise of data-driven journalism brings with it a potential pitfall for media organizations and readers alike. Digital information overload will breed fatigue around numbers if the quantity of reporting becomes more highly valued than its quality. Having access to mountains of data is a huge benefit, but a reporter still has to be a journalist first to ensure they’re not getting buried under the numbers and missing the stories.
In other words, a political journalist still needs to be a politico, not just a statistician. They could fall into the trap of placing too much importance on meaningless correlations as indicators of voter sentiment, losing their grasp on what made them a great political reporter in the first place. As data gets bigger, this will become harder to resist. So they need to become experts in making Big Data small: rather than obsessing over the numbers, obsessing over figuring out what those numbers really mean. In doing that, they have an unprecedented opportunity to make people more informed rather than simply overwhelming them with a series of conflicting data sets.
Some media organizations are already tackling the challenge of remaining relevant in a world of information overload. Using big data and visualizations, they are making great strides in making data journalism more accessible to reporters, politicos, and voters, which is proving its worth in giving political reporting a new lease of life.
Reuters’ Polling Explorer tool is an example of how this is being done, offering up customizable data visualizations focusing on the biggest talking points in the U.S. leading up to the election. It’s an entirely new scale of public opinion measurement, presented in a way anyone can understand and use, while enabling Reuters to usher in its own improved brand of accurate, fact-based, and timely journalism.
We can see the true potential of using real-time data analysis to measure up-to-the-minute public opinion in one poll on the most important problem facing the U.S. today. Immediately after the Paris attacks in November, terrorism surged past the economy as the number-one issue, and it rose sharply again straight after the December San Bernardino attack. For Reuters, this is just one of many examples of their greatly increased ability to spot outliers in the data.
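Reuters hasn’t published the pipeline behind the Polling Explorer, but the basic aggregation that produces a spike chart like the one described above is straightforward. The following is a minimal Python sketch under assumed conditions: the function name `top_issue_shares`, the data layout (a list of date/issue pairs), and all the synthetic responses are invented for illustration, not taken from Reuters’ data.

```python
from collections import Counter
from datetime import date, timedelta

def top_issue_shares(responses):
    """Group 'most important problem' responses by ISO week and return,
    for each week, the share of respondents naming each issue.
    `responses` is a list of (date, issue) tuples."""
    weekly = {}
    for day, issue in responses:
        week = day.isocalendar()[:2]  # (ISO year, ISO week number)
        weekly.setdefault(week, Counter())[issue] += 1
    shares = {}
    for week, counts in sorted(weekly.items()):
        total = sum(counts.values())
        shares[week] = {issue: n / total for issue, n in counts.items()}
    return shares

# Synthetic data: "economy" dominates until Nov 13, 2015 (the Paris
# attacks), after which "terrorism" becomes the dominant answer.
responses = []
start = date(2015, 11, 2)
for offset in range(28):
    day = start + timedelta(days=offset)
    dominant = "terrorism" if day >= date(2015, 11, 13) else "economy"
    other = "economy" if dominant == "terrorism" else "terrorism"
    responses += [(day, dominant)] * 7 + [(day, other)] * 3

shares = top_issue_shares(responses)
for week, dist in shares.items():
    print(week, {k: round(v, 2) for k, v in dist.items()})
```

Running the sketch prints one weekly distribution per line, with the terrorism share jumping in the week of the attacks, which is exactly the kind of shift a reporter would then have to investigate and explain rather than just publish.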
Reuters Polling Explorer runs on SAP HANA, an in-memory data platform that allows Reuters to access and analyze 100 million survey responses for quicker and more efficient reporting of public opinion.
For more on data analytics in today’s media environment, see How Big Data Is Changing The News Industry.