Where Big Data Fits Into 24-Hour News

Michael Brenner

Round-the-clock, rolling news changed journalism forever. A profession once built on accuracy and carefully considered insight is now governed by speed and the desire to get ahead of the competition.

Responses must be fast, delivered while the news is still fresh in the audience’s memory. If this means responding before the full facts are revealed and digested, then so be it. Waiting for an understanding of the bigger picture before delivering a comment simply takes too long.

This sort of atmosphere was, perhaps, understandable in the formative days of rolling news. Broadcasters and news outlets were jockeying for position back then, and being first to break the big news stories and to offer background information to viewers was vital in getting the jump on the competition.

But now we live in an era of citizen journalism. Twitter has become a source of breaking news, from the Arab Spring to the atrocities in Aleppo, and these citizen sources are putting further pressure on traditional news agencies.

We should have moved beyond this by now. Twenty-four-hour news coverage is now decades old, enough time for the dust to settle and for broadcasters to have refined the balance between timely delivery and trustworthy, factual content.


Increasing unreliability

We are now entering an age of misinformation, in which fake news – whether distributed for satirical purposes or with malicious intent – is consumed en masse.

We’ve seen the effects of this, and the dangerous ramifications it produces, in the stories that were passed back and forth between camps during the recent U.S. presidential election, the Brexit vote, and other political firestorms around the world. In fact, this has been taking place for quite some time; it is just that now we are waking up to the peril.

The “speed over accuracy” attitude held by many modern news outlets is contributing to this danger and also eroding the public’s trust in mainstream news broadcasters. Statistics released by Gallup showed that the percentage of Americans who distrust the media hit 60% in 2012, a high not seen in over 15 years, and that the figure had still not budged by 2014.

However, despite such negativity, 24-hour coverage is undoubtedly a positive step for news media. Current events do not unfold in strictly regimented bulletins, nor is the data they produce packaged neatly for our consumption. Instead, newsworthy events happen unpredictably, around the clock. The only way to reflect this accurately is round-the-clock coverage, breaking the news as it happens.

So, 24-hour coverage is undoubtedly the right step for news media, but one that has so far been handled badly. How can this negativity be overturned? How can the public get the 24-hour coverage they deserve?

Data-driven journalism

We cannot be naïve about this. We must understand that news outlets are businesses, operating in a competitive market. If they cannot deliver their products to their audience before the competition can, their profits begin to tumble.

Nor can we use this to absolve unscrupulous news agencies that play fast and loose with the truth, especially not when there is a viable alternative: one that balances factual veracity with speedy delivery.

This is data-driven journalism, and it represents a way for news outlets and broadcasters to rebuild a reputation that has taken significant damage in recent years. By deploying data in their news coverage, media services can support the efforts of their journalists and take considerable steps towards greater accuracy.

The philosophy behind data-driven journalism is that the story is already there; it is present in the raw data and is self-evident. However, the public has neither the time nor the tools to decode this narrative, so journalists must present the story in a more digestible manner, without deviating from its core truths.

How data-driven journalism works in practice

This means adopting a wholly new outlook on 24-hour news. Rather than passively receiving “facts” and interpreting them, journalists must delve into the data they have at their disposal and understand the message and the narrative.

From here, journalists filter the data – not to remove inconvenient elements but to ensure that the dataset is relevant enough to build a story around. This freshly mined data can then be visualized and, finally, crafted into a story. Each step of this four-step process – gathering, filtering, visualizing, and storytelling – is designed to enhance the value the public can derive from the data, culminating in a fully formed story.

Using this model, journalists can work swiftly and effectively, building a story that is as true to the raw data as possible and delivering it to the audience while it is still relevant. Staying true to the data requires this sort of mechanical process, in contrast to the rapid-fire commentary that many news outlets practice today.
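
To make the shape of this workflow concrete, here is a minimal sketch of the four steps in Python. The dataset, field names, and figures are purely hypothetical placeholders rather than any outlet’s real data or tooling, and in a working newsroom the “visualize” step would produce an actual chart rather than a simple tally.

```python
from collections import Counter

# Step 1: gather raw data (a hypothetical, illustrative set of poll responses).
raw_responses = [
    {"question": "trust_in_media", "answer": "distrust", "year": 2014},
    {"question": "trust_in_media", "answer": "trust", "year": 2014},
    {"question": "trust_in_media", "answer": "distrust", "year": 2014},
    {"question": "unrelated_topic", "answer": "n/a", "year": 2014},
]

# Step 2: filter -- keep only records relevant to the story being built,
# not discard anything merely because it is inconvenient.
relevant = [r for r in raw_responses if r["question"] == "trust_in_media"]

# Step 3: visualize -- reduce the filtered data to a summary the audience can
# grasp at a glance (a chart in practice; a simple tally here).
tally = Counter(r["answer"] for r in relevant)
share_distrust = tally["distrust"] / len(relevant)

# Step 4: craft the story -- present the finding in plain language while
# staying true to the underlying numbers.
headline = f"{share_distrust:.0%} of respondents in this sample say they distrust the media."
print(headline)
```

The point of the sketch is that each stage transforms the data without changing what it says: nothing is discarded for being inconvenient, and the final headline can always be traced back to the raw numbers.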

Of course, this four-step model cannot exist in isolation. It needs to be supported by good data. For a broadcaster or news outlet, amassing and maintaining stores of credible, reliable data is a primary duty, which means taking a diligent and responsible approach to the data used to support every story.

The data-driven outlets working today

Data-driven news media is not some idealistic dream, nor is it a distant possibility. Instead, it is a very real genre of journalism that is being practiced today by many media services and news broadcasters.

The Reuters Polling Explorer, for example, is a feature developed by the global news agency that enables users to drill down into data themselves, developing an understanding and gaining insight with minimal external influence. The idea is to revolutionize news delivery by putting the tools directly into the users’ hands.

In the UK, news outlets like The Guardian have demonstrated an ongoing commitment to data-driven journalism, highlighting how such techniques enable them to provide effective, accurate coverage for their readers. Reuters and The Guardian are not the only organizations pursuing this responsible approach; they are just two of the outlets flying the flag for data-driven journalism across the global news landscape.

Data-driven journalism is here, and it is driving real change in the way we consume news media. Perhaps this is the high-water mark for misinformation and low-quality news services; perhaps this is the point at which a new chapter for journalism opens.


This post is the fourth in a seven-part series entitled “Reimagining Media in The Digital Age.” Check back weekly for further posts in the series.


About Michael Brenner

Michael Brenner is a globally recognized keynote speaker, author of The Content Formula, and the CEO of Marketing Insider Group. He has worked in leadership positions in sales and marketing for global brands like SAP and Nielsen, as well as for thriving startups. Today, Michael shares his passion for leadership and marketing strategies that deliver customer value and business impact. He has been recognized by the Huffington Post as a Top Business Keynote Speaker and by Forbes as a top CMO influencer.