Crossing The Big Data Analytics Chasm

Bill Schmarzo

The key to any organization’s digital transformation is becoming more effective at leveraging data and analytics to power its business models. That is, how can organizations exploit the growing bounty of internal and external data sources to uncover new sources of customer, product, service, operational, and market insights? How can they then use those insights to optimize key business and operational processes, mitigate compliance and cybersecurity risks, uncover new monetization opportunities, and create a more compelling, differentiated customer experience?

Becoming more effective at leveraging data and analytics forces organizations to move beyond the world of business intelligence (BI) to embrace the world of predictive and prescriptive analytics. Business intelligence is about descriptive analytics: retrospective analysis that provides a rearview-mirror perspective on the business, reporting on what happened and what is currently happening. Predictive and prescriptive analytics are forward-looking: they provide insights into where the business is headed, predicting what is likely to happen (with an associated probability) and prescribing what one should do about it.

There is a natural organizational analytics maturation in moving from BI to predictive analytics and prescriptive actions:

  • Descriptive questions: Use BI and data warehousing to support management and operational reporting and dashboards built on aggregated data. Descriptive analytics answers the question: “What has happened?”
  • Predictive analytics: Use statistical models to quantify cause-and-effect in order to predict what is likely to happen or how someone is likely to react (e.g., a consumer’s FICO credit score predicting the likelihood of repaying a loan). Predictive analytics answers the question: “What is likely to happen?”
  • Prescriptive actions: Use optimization algorithms to prescribe actions to improve human decision-making around outcomes. Prescriptive analytics answers the question: “What should we do?”

The analytics chasm

Unfortunately, for many companies with whom I talk and teach, there is an “analytics chasm” that is hindering the transition from descriptive questions to predictive analytics and prescriptive actions. This chasm is preventing organizations from fully exploiting the potential of data and analytics to power the organization’s business and operational models.

Many think that crossing the analytics chasm is a technology issue, so they throw technology at the problem (and money right down the bottomless technology chasm). Forever in search of the technology “silver bullet” (the newest technology that magically solves the analytics chasm challenge), IT organizations continue to buy new technologies (or continue to invest in outdated technologies) without a good understanding of what it takes to cross the chasm.

And the answer to this challenge? Economics!

Crossing the analytics chasm

Yes! The key to crossing the analytics chasm is understanding and mastering the economic value of Big Data: being able to exploit the potential of Big Data and data science to create new sources of value.

Economics is the branch of knowledge concerned with the production, consumption, and transfer of wealth.

Crossing the analytics chasm requires an understanding of economics and of how the organization can leverage digital economics to identify and capture new sources of customer and market value. Specifically, it requires:

  • Transitioning from an organizational mentality of using data and analytics to monitor the business to predicting what’s likely to happen and prescribing actions to prevent or monetize that prediction
  • Maturing beyond aggregating data to control the costs of storage and data management to a mentality of hoarding every bit of detailed historical data, complemented with a wealth of external data sources (social media, weather, local events, economic, demographic) about every customer, employee (physician, teacher, engineer, technician, mechanic), product, device, and asset
  • Expanding data access from a restrictive data-access model (because it’s easier to walk on the sun than to add a new data source to your data warehouse) to enabling access to all data – internal or external – that might have value, given the business and operational decisions the organization is trying to optimize
  • Transitioning from batch data processing (and praying that your ETL programs can meet the SLA windows) to an operational model that can process and analyze the data in real time or near real time – to “catch the business in the act” and thereby create new monetization opportunities

As we discovered in our Economic Value of Data (EVD) research at the University of San Francisco, leading organizations are embracing the economic value of Big Data to leap over the analytics chasm.

For more details on the University of San Francisco research project, check out “Applying Economic Concepts To Big Data To Determine The Financial Value Of The Organization’s Data And Analytics Research Paper.”

About Bill Schmarzo

Bill Schmarzo is CTO, IoT and Analytics at Hitachi Vantara. Bill drives Hitachi Vantara’s “co-creation” efforts with select customers to leverage IoT and analytics to power digital business transformations. Bill is an avid blogger and frequent speaker on the application of big data and advanced analytics to drive an organization’s key business initiatives. Bill authored a series of articles on analytic applications, and is on the faculty of TDWI teaching a course on "Thinking Like A Data Scientist." Bill is the author of “Big Data: Understanding How Data Powers Big Business” and "Big Data MBA: Driving Business Strategies with Data Science." Bill is also an Executive Fellow at the University of San Francisco School of Management, and an Honorary Professor at the NUI Galway J.E. Cairnes School of Business & Economics.