Throughout the history of business computing – from punchcards to artificial intelligence – companies have sought to extract value from their data. Go back to the 1970s and 80s, when mainframes dominated the scene. The objective then: Use data to identify materials requirements and optimize plant floor operations.
Jump to the client-server days of the 1990s and the early Internet. The objective then was end-to-end business-process automation. Companies used their data to standardize processes throughout the organization and run more efficiently.
The Internet drove widespread connectivity, which brought us to the age of Big Data. The companies that won the day paired massive volumes of data with cloud and mobile technologies to deliver better customer experiences. Think of Amazon, Airbnb, or Netflix.
The intelligent enterprise
Today, organizations still seek to maximize the value of data – but now the possibilities have expanded tremendously. For many organizations, the goal is to become what we at SAP call the Intelligent Enterprise.
With help from intelligent technologies (IoT, machine learning, blockchain, and more), organizations now understand that they can meet demand more effectively by using their data to predict what customers want before they ask for it. Based on this predictive power, many organizations are also developing entirely new business models that improve overall outcomes for customers through better analysis and insight.
But to get there, organizations need to get their data management right.
Access, performance, and trust in your data sources
The simple fact is that with so much data available, today’s business advantage goes to the organization that manages it best. One challenge is that data sets are increasingly distributed across networks, making access difficult. It doesn’t help that organizations have traditionally kept live data in one place and analytical data elsewhere – such as in business warehouses where it can be isolated and manipulated for reporting purposes.
A related challenge is speed. With analytical data residing off to the side, organizations can expect delays as experts move data in batches to generate the needed analysis. This takes time – something most organizations do not have in the digital economy.
At SAP, we advocate an in-memory database approach as a solution to the twin problems of access and performance. With the cost of memory falling, it is now feasible for organizations to store all data in active memory where access is lightning fast. With transactional data and historical data in the same place, you can run analytics on live information – in the moment.
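To make the idea concrete, here is a minimal sketch of running analytics directly on live transactional data in a single in-memory store. It uses SQLite’s in-memory mode purely as a stand-in for illustration – SAP HANA is the actual in-memory platform discussed here, and this is not its API.

```python
import sqlite3

# Illustration only: SQLite's :memory: mode stands in for an in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")

# Transactional writes land directly in memory...
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("EMEA", 120.0), ("APAC", 75.5), ("EMEA", 30.0)],
)

# ...and analytics run against the very same live store - no batch
# export to a separate warehouse, no waiting for an overnight load.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 75.5), ('EMEA', 150.0)]
```

The point is the architecture, not the engine: when transactions and history share one in-memory store, the analytical query sees every write the instant it lands.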
Additionally, we advocate a virtual data access approach, where a central data management system can reach a wide range of data sources without moving the data from its location. Because the data stays where it resides, it remains secure and is accessed only when needed for analytics. This approach reduces data-administrator overhead and dramatically simplifies the data architecture for advanced analytics compared with typical architectures of the past.
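The virtual access pattern can be sketched in a few lines: a central catalog resolves queries against registered sources on demand, pulling records only at query time rather than copying them into a warehouse. The class name, source names, and callback API below are illustrative assumptions, not a real product interface.

```python
# Hypothetical sketch of virtual data access: data stays in its source system
# and is fetched only when a query actually needs it.

class VirtualCatalog:
    def __init__(self):
        self._sources = {}

    def register(self, name, fetch_fn):
        # fetch_fn is invoked lazily - nothing is copied at registration time.
        self._sources[name] = fetch_fn

    def query(self, name, predicate):
        records = self._sources[name]()  # pulled from the source at query time
        return [r for r in records if predicate(r)]

# Two "remote" systems, left where they live:
crm = lambda: [{"customer": "Acme", "churn_risk": 0.8}]
erp = lambda: [{"customer": "Acme", "open_orders": 3}]

catalog = VirtualCatalog()
catalog.register("crm", crm)
catalog.register("erp", erp)

at_risk = catalog.query("crm", lambda r: r["churn_risk"] > 0.5)
print(at_risk)  # [{'customer': 'Acme', 'churn_risk': 0.8}]
```

The design choice to register a fetch function rather than a data copy is what keeps administration light: there is one catalog to govern, but no second copy of the data to secure, refresh, or reconcile.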
The speed of in-memory processing and the simplicity of virtual data access are augmented by the fact that all data is managed in a way that makes it possible to extract value using intelligent technologies. Now you can focus analytical algorithms on all of the data without any of the technical overhead required to optimize calls to disk.
Take, for example, the scenario of predictive maintenance. A manufacturer of, say, HVAC machines can now use IoT sensors to track the health and status of machines deployed across a customer’s industrial facilities. All of this data gets ingested and stored in memory, where algorithms can detect patterns that predict machine failure before it happens.
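A toy version of that pattern detection: flag a machine when the rolling average of its recent sensor readings drifts beyond a healthy baseline. The sensor values, window size, and threshold below are illustrative assumptions, not a real maintenance model – production systems would learn these patterns from historical failure data.

```python
# Toy predictive-maintenance check: flag a machine when the rolling average
# of its last few readings exceeds a healthy limit. All numbers are made up.

def needs_maintenance(readings, window=3, limit=75.0):
    """Return True when the average of the last `window` readings exceeds `limit`."""
    if len(readings) < window:
        return False  # not enough data yet to judge
    recent = readings[-window:]
    return sum(recent) / window > limit

# Simulated temperature streams from two HVAC units (degrees C):
healthy = [62.0, 64.5, 63.0, 61.8]
failing = [62.0, 70.0, 78.5, 81.0]  # trending upward toward failure

print(needs_maintenance(healthy))  # False
print(needs_maintenance(failing))  # True
```

A real deployment would replace the fixed threshold with a model trained on labeled failure history, but the flow is the same: sensor stream in, in-memory scoring, maintenance ticket out before the breakdown.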
The cost of maintenance, of course, is lower than the cost of repair – so the ability to predict machine failure allows you to optimize maintenance processes. Downtime for your customers is minimized as well. And with machine learning, you can even improve your algorithms over time. This means you continuously get better at your predictions – and at delivering the outcomes your customers expect.
A digital core for the intelligent enterprise
Predictive maintenance, of course, is just one of many possibilities opened up by an approach to data management worthy of the digital economy. Colgate-Palmolive, for instance, manages its data to form a digital core that drives much of its success.
A customer of SAP since 1994, Colgate-Palmolive has lived through all the twists and turns of business computing over the years. Today, with in-memory data management capabilities and advanced analytics, the company has dramatically improved business planning. Instead of relying on periodic structured reviews from reporting systems that showed where the company stood yesterday, planners can now use live data to review the state of the business in real time – which speeds decision-making. And with a simplified digital core now in place, the company can move forward with intelligent technologies such as machine learning and the Internet of Things to derive even more value from its data – and then pass that value along to its customers.
See the Gartner video
Interested in how a modernized approach to data management helps drive the intelligent enterprise? See the new video from Gartner for a deeper dive.