Unlocking The Data Challenge: Measuring Excellence (Part 5)

Christian Thisgaard

Part 5 in the six-part “Data-Driven Enterprise” series, which examines the challenges, leadership requirements, measurement models, and best practices to become a data-driven enterprise.

Measuring data-to-action lead times is critical to drive and monitor annual improvements, whether incremental or leaps. In this way, the organization reinforces awareness that becoming data-driven is a constant focus that everyone must prioritize. Further, by continually eliminating data-to-action bottlenecks, the organization becomes yet more agile, data-centric, and simpler for both internal and external stakeholders to work with.
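To make the idea concrete, a data-to-action lead time can be computed as the elapsed time between when a data point was captured and when the resulting business action was taken. A minimal sketch, assuming a hypothetical event log with `captured` and `acted` timestamps (the field names and figures are illustrative, not from any real system):

```python
from datetime import datetime
from statistics import median

# Hypothetical event log: each record notes when the data point was captured
# and when the resulting business action was taken.
events = [
    {"captured": datetime(2023, 5, 1, 8, 0), "acted": datetime(2023, 5, 3, 9, 30)},
    {"captured": datetime(2023, 5, 2, 10, 0), "acted": datetime(2023, 5, 2, 16, 0)},
    {"captured": datetime(2023, 5, 4, 7, 0), "acted": datetime(2023, 5, 9, 7, 0)},
]

def lead_times_hours(log):
    """Return data-to-action lead times in hours, one per event."""
    return [(e["acted"] - e["captured"]).total_seconds() / 3600 for e in log]

times = lead_times_hours(events)
print(f"median lead time: {median(times):.1f} h, worst: {max(times):.1f} h")
```

Tracking the median and worst-case lead time year over year is one simple way to verify that bottleneck elimination is actually paying off.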

In principle, measuring data-to-action value is not very different from the models used for measuring Overall Equipment Effectiveness (OEE) or Eliyahu Goldratt’s Theory of Constraints:

First, the enterprise needs to assess whether it has the required data at the granularity needed to discover new insights. Also, the data volume must be large enough for the data to be interpreted correctly and interconnected with other data sources.

The quality of the data is also essential to ensure that predictions and recommendations can be trusted. The right governance procedures must be in place to evaluate the reliability of the sources and monitor data lineage. Harmonizing data from different sources is also crucial to ensure analysis results are accurate.
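Harmonization in practice often means mapping feeds with different field names and units onto one common schema before analysis. A minimal sketch, assuming two hypothetical sources (`source_a` reporting revenue in EUR thousands, `source_b` in raw USD; the names, records, and the fixed exchange rate are illustrative only):

```python
EUR_TO_USD = 1.1  # assumed fixed rate, for illustration only

source_a = [{"cust": "ACME", "revenue_keur": 120}]        # EUR thousands
source_b = [{"customer": "Globex", "revenue_usd": 95_000}]  # raw USD

def harmonize(a_rows, b_rows):
    """Map both feeds onto one schema: customer name plus revenue in USD."""
    out = []
    for r in a_rows:
        out.append({"customer": r["cust"],
                    "revenue_usd": r["revenue_keur"] * 1000 * EUR_TO_USD})
    for r in b_rows:
        out.append({"customer": r["customer"],
                    "revenue_usd": float(r["revenue_usd"])})
    return out

records = harmonize(source_a, source_b)
```

Without this kind of normalization step, aggregates across the two sources would silently mix currencies and scales, which is exactly the accuracy risk the paragraph above describes.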

The speed at which data can be accessed, processed, and analyzed impacts a company’s ability to act in the moment. Depending on the use case, the acceptable delay between data-to-insight and insight-to-action can vary. The use case also determines whether the data and resulting insights fit the end purpose. Considerations about whether the data is meant for data scientists or business users, for BI tools or robotic automation, also play an important role in ensuring that the project hits the mark.

Amount: Securing enough granular data on the customer, process, or event to do advanced modeling of the variances and determine the correlations associated with positive and negative impact
Caution: Enterprises may need to upgrade their operational systems or add or acquire more granular external customer data to get sufficient data and apply advanced predictive algorithms to their business.
Quality: Transforming, cleansing, and normalizing the raw data to be fit for analysis
Caution: The closer to the real-world level data is captured – sensor data, online data, and so on – the more uneven it tends to be. In this case, significant data manipulation is required to identify what constitutes normal behavior, what non-normal behavior the data represents, which data trends are significant, and how to tie the data together with other sources/systems.
Usage: Using the new insight to improve experiences, services, and operations
Caution: Leveraging new insights effectively at enterprise scale has proven to be complex, as described above. Many enterprises have so much information hitting the business that they ignore the input as noise, either because they don’t trust it or because it conflicts with other information.
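The Quality caution above – separating normal from non-normal behavior in uneven, real-world data – is often approached with a simple statistical baseline before any advanced modeling. A minimal sketch, assuming hypothetical sensor readings and a z-score threshold (both the readings and the threshold are illustrative choices, not a prescribed method):

```python
from statistics import mean, stdev

# Hypothetical raw sensor readings: mostly steady, with one spike.
readings = [20.1, 19.8, 20.3, 20.0, 35.7, 19.9, 20.2]

def flag_outliers(values, threshold=2.0):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

anomalies = flag_outliers(readings)
```

A flagged reading is not automatically an error; it is a candidate for the kind of investigation the caution describes – is it a sensor fault, a genuine event, or a trend worth tying to other sources?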

The conclusion of this series explores how to begin and the best practices to use.

Read the other posts in the Data-Driven Enterprise series here.


About Christian Thisgaard

Christian Thisgaard is global vice president of SAP’s Customer Co-Innovation Center, focusing on driving innovation with customers and partners to enable them to operate in a more data-driven and autonomous way. With his experience in technology-driven innovation, combined with his strategic advisory background, he has helped Fortune 500 companies through complex transformations, instituting new structures, processes, and IT globally. Within SAP, he has led several strategic programs towards market positioning, global processes, and solution roadmaps. He has broad international exposure, having lived in Europe and APJ, and is now based in Texas.