Part 3 in the Edge Computing series
It is said that history doesn’t necessarily repeat itself, but it sure does rhyme. The Internet of Things and edge computing, I think, are one of those rhymes, or one of the swings of the pendulum between centralized and decentralized computing. And for 45 years, SAP has been a key player in these swings, both driving and responding to technology advances. As the pendulum swings again today, let’s take a look at how we got here.
SAP, mainframes, and the first centralization
It was in 1972 when the co-founders of SAP landed their first customer—the German branch of Imperial Chemical Industries in Östringen. The job then was to develop and deploy a system for financial accounting, materials and logistics planning, and order processing. The system ran on a mainframe, with data fed in via punch cards.
The mainframe represents a centralized approach to computing where all compute resources are maintained in one location. Historically speaking, this makes perfect sense. In the early days, compute resources were extremely expensive. They were also big. The first mainframe, built in 1943, weighed in at five tons and ran 51 feet long. Nobody was carrying this machine in a briefcase.
Mainframes also offer centralized administration and optimized data storage on disk. Access to the mainframe, after all, is managed through dumb terminals or thin clients with no processing power—which means there are no pesky personal computers to contend with. All data is stored on the mainframe, with the terminals providing little more than a window into the soul of the machine.
Decentralization and client-server network
Mainframes have not gone away. Credit card companies and airlines, in particular, still use them to send data (credit card information) or display it (flight information) via the dumb terminal. But mainframes are expensive.
At much lower cost, businesses eventually found that they could set up a decentralized, or “distributed,” client-server network, where workloads are partitioned between servers that provide a service and clients that request it. The client-server model took off with the advent of PCs that could process data and perform calculations on their own—which allowed applications to be decentralized. In addition to cost savings, benefits included greater flexibility and mobility, more freedom to choose software from different vendors, and the ability to share information anywhere in the world (provided a secure Internet connection, that is).
As much of the business world moved in this direction, SAP famously presented a new Unix-based, client-server version of SAP R/3 at the CeBIT show in Hannover, Germany in 1991. This new model is often credited with helping SAP break into the American market and beyond to become a truly global company.
The cloud and back again
The story doesn’t end there, of course—because soon enough, along came the cloud. The cloud can be seen as a return to the centralized mainframe model where computing power is moved away from the end users into a centrally managed location. But here’s the twist: because of a fast and reliable Internet, cloud computing is centrally managed in name only—more as a logical concept than a reality.
Compute resources in the cloud, in fact, can be spread out across physical data centers. What’s more, spikes and troughs in capacity demand now can be met with greater flexibility by quickly firing up or decommissioning virtual resources. Companies find this tremendously advantageous. Seeing this coming, SAP moved aggressively to the cloud, both in terms of the products it offers and simply as a company itself competing in the digital economy.
Are we repeating ourselves? IoT and the edge
All of which brings us around to the new decentralization with IoT and edge computing. As I covered in my last blog, the model for the IoT is to push connected “things” out to the “edge”—where the edge exists in relation to the core IT assets of the organization deploying the “things” in the first place.
The idea of edge computing is that an IoT gateway sits out on the edge in close proximity to the deployed assets in the field. Returning to the idea of client-server, it can be said here that processing workloads are partitioned between the edge (sort of like the client) and the core (sort of like the server). On the other hand, the mini edge network of things connected to the gateway (apart from the core) is a bit like the mainframe model in that the sensors are like dumb terminals whose sole purpose is to push data to the gateway. Within this mini network, at least, all the real action happens on the gateway—as is true with the mainframe. And, of course, it all comes together with the gateway eventually connecting to the core via the cloud.
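The partitioning described above can be sketched in a few lines of code. This is purely an illustrative toy (the class and field names are my own invention, not any SAP or IoT product API): sensors act like dumb terminals that only push raw readings, the gateway does the local aggregation, and only a compact summary would travel on to the core.

```python
from statistics import mean

class SensorReading:
    """A 'dumb' sensor reading: the sensor's sole job is to push a value."""
    def __init__(self, sensor_id, value):
        self.sensor_id = sensor_id
        self.value = value

class EdgeGateway:
    """Toy gateway: collects raw readings at the edge, forwards only summaries."""
    def __init__(self):
        self.buffer = []

    def ingest(self, reading):
        # Raw data stays local to the edge network.
        self.buffer.append(reading)

    def summarize(self):
        # The "real action" happens here on the gateway: aggregation,
        # so the core receives one small payload instead of the raw stream.
        summary = {
            "count": len(self.buffer),
            "avg": mean(r.value for r in self.buffer),
            "max": max(r.value for r in self.buffer),
        }
        self.buffer.clear()
        return summary

gateway = EdgeGateway()
for i, temp in enumerate([21.0, 22.5, 23.5]):
    gateway.ingest(SensorReading(f"sensor-{i}", temp))

# In a real deployment this payload would be sent to the core via the cloud.
payload = gateway.summarize()
```

The point of the sketch is the division of labor: the sensors do nothing but push, the gateway computes, and the core sees only the condensed result.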
Today, SAP is engaged in a big push to help customers seize the advantage with IoT. Part of my responsibility is to focus specifically on the edge computing aspect of IoT. As I look back on the past—both for SAP and the IT industry in general—what I see is that no, history doesn’t really repeat itself. Rather, it comes around again and again in a spiral, constantly moving upward. When old concepts from the past are applied to a new context, they may sound familiar, but they take on an entirely different meaning. And this, I think, is the definition of rhyme.
For more on edge computing, see the new Forrester paper, “Connect the Edge And Core To Power Your Business” here.