For more than 40 years, organizations have been forced to run operational and analytic processes on different systems. The latency of disk-based databases and the high cost of live memory meant that combining operational and analytic processes just didn’t make economic sense.
Now new technology and falling costs are overturning a generation of analytics best practice. It’s becoming faster, simpler, and cheaper to use a single in-memory system for both operations and analytics.
There have been roughly three phases of in-memory development:
(1) In-memory databases
This idea is far from new (one of the first to tout this approach as a differentiator was TM1 by Applix in the 1990s). But the economics have changed radically since the advent of 64-bit systems, and in the last few years in-memory technologies such as SAP HANA have proved their worth, helping to radically speed up and simplify business users' access to information.
(2) In-memory analytic platforms
The power of in-memory isn’t limited to traditional structured data processing. It is also particularly well adapted for other types of analytic processing that require complex, high-speed calculations, such as predictive analytics. Today, this kind of processing is typically carried out using systems that are separate from the main database. Bringing them into a single system again simplifies and speeds up business analysis.
The latest version of SAP HANA integrates support for in-memory text analysis, predictive analytics, big data, and business calculations such as complex profitability and costing, or dynamically reallocating budgets.
(3) In-memory platforms
SAP HANA now includes a full-featured application server, web server, and development environment that allow developers to leverage the power of in-memory, pushing as much of the complex logic down into the database as possible.
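The "push-down" idea can be sketched with a minimal, hypothetical example. The sketch below uses Python's built-in sqlite3 module with an in-memory database as a stand-in (it is not HANA code, and the table and column names are invented): instead of fetching every row into the application and aggregating there, the same logic is expressed in SQL and executed inside the database engine, so only the result crosses the boundary.

```python
import sqlite3

# In-memory SQLite database as a stand-in for an in-memory platform
# (illustrative only; SAP HANA has its own SQL dialect and engines).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("EMEA", 120.0), ("EMEA", 80.0), ("APAC", 200.0)],
)

# Application-side aggregation: every row is shipped to the application.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()
totals = {}
for region, amount in rows:
    totals[region] = totals.get(region, 0.0) + amount

# Push-down: the aggregation runs inside the database, and only the
# (much smaller) aggregated result is returned.
pushed = dict(
    conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region")
)

assert totals == pushed
```

Both paths produce the same answer; the difference is where the work happens and how much data moves, which is exactly what shrinks when complex logic lives next to the data.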
In 2012, SAP started releasing applications that combine the best of operations and analytics in a single solution running on SAP HANA, including SAP CRM, SAP Business One, and SAP Sales and Operations Planning. And SAP has just announced that the company's core business suite of applications now runs on SAP HANA.
In-memory has already become the architecture of choice for the recent generation of cloud-based application vendors – now existing companies can get the same benefits for their on-premise systems, without radical disruption to existing applications.
The Benefits of Convergence
The ability to run transactional processing and analytics on the same platform brings many benefits, including:
- Faster, better analytics. Business people can access data the instant it has been updated, without complicated and expensive replication and aggregation. Analytics can be embedded into operational processes without having to worry about data inconsistency because of time lags.
- Faster, better applications. Having a single system makes it easier to analyze, adjust, and optimize operational processes in real-time, rather than after things have gone wrong or opportunities have been missed. The opportunities are endless, from predictive maintenance to personalized, interactive offers for shoppers, or individualized medicine based on genome analysis.
- Lower costs. Yes, in-memory architectures remain more expensive than disk-based ones, but having a single reporting system, together with the reduced costs of data duplication and manipulation, results in a compelling business case even before counting the business benefits of faster decisions.
- Simplified application architectures. Most operational systems today require layers of cached data tables in order to provide acceptable performance. In-memory technologies hold out the promise of radically simplifying application architectures. For example, the number of data tables in a financial application could be reduced to just two, with all views of the data (balance sheet, etc.) calculated on the fly, at any moment.
- More flexibility. Simplified architectures make it easier to update and adapt both analytic and operational systems. Changes require tweaks to metadata rather than dumping, recalculating, and reloading large quantities of data.
- A ‘single source of truth’. This has long been a holy grail of the analytics industry. Reducing data duplication, and making it easier to carry out operational data governance in real time (data quality, master data management), is a big step forward.
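The "two tables, views calculated on the fly" idea above can be sketched in a few lines. This is a hypothetical toy, not HANA code: a single table of raw journal line items (the names `line_items` and `balance_sheet` are invented for illustration), with the balance-sheet view computed at query time instead of being stored as pre-aggregated tables.

```python
from collections import defaultdict

# One table of raw financial line items: (account, account_type, amount).
# In the simplified architecture, this raw data is the only stored state.
line_items = [
    ("cash",      "asset",     500.0),
    ("inventory", "asset",     300.0),
    ("loan",      "liability", 400.0),
    ("equity",    "equity",    400.0),
]

def balance_sheet(items):
    """Compute the balance-sheet view on the fly from raw line items."""
    view = defaultdict(float)
    for account, acct_type, amount in items:
        view[acct_type] += amount
    return dict(view)

view = balance_sheet(line_items)
# The accounting identity holds: assets = liabilities + equity.
assert view["asset"] == view["liability"] + view["equity"]
```

Because the view is derived rather than stored, any change to the raw line items is immediately reflected in every report, with no replication or reconciliation step, which is the flexibility and single-source-of-truth point above in miniature.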
Of course, no technology is a silver bullet. In particular, in-memory systems don’t directly fix some of the biggest problems plaguing the provision of business analytics, notably the pains of data integration across multiple incompatible systems, and the politics of deciding common definitions for business concepts across the organization. And in-memory does nothing to ensure that business people actually make best use of the data that is provided.
But is it a big deal that we can now knock down a technical barrier that has plagued the industry for over 40 years? Yes indeed!