
The Super Materials Revolution

Dan Wellers

Thousands of years ago, humans discovered they could heat rocks to get metal, and it defined an epoch. Later, we refined iron into steel, and it changed the course of civilization. More recently, we turned petroleum into plastic, with all that implies. Whenever we create new materials that push the limits of what’s possible, we send the world down an entirely new path.

Today, we’re on the verge of a revolution in materials science that will transform the world yet again. Scientists have developed tools that make it possible to design, build, and shape new “super materials” that will eclipse what we once believed were physical limits, create previously unimaginable opportunities, and expand the capabilities of what we already think of as exponential technologies in ways limited only by our imaginations.

Super strength in a pencil

The materials of the future are already being made in the present. One astonishing example is graphene, derived from the same graphite that’s in the pencil on your desk. A sheet just one atom thick, graphene is essentially two-dimensional. It weighs next to nothing, yet is up to 300 times stronger than steel. It conducts electricity more efficiently and faster than any other material. It dissipates heat faster than any other known material. It’s the only substance on earth that is completely impermeable to gas.

Excitement about graphene’s potential was high from the first, and it’s not ebbing. At least 13 conferences focusing on graphene, 2D substances, and nanotechnology are scheduled for 2016. The European Commission has created Graphene Flagship, Europe’s largest-ever research initiative, to bring graphene into the mainstream by 2026. And researchers have already developed an array of fascinating uses for graphene: new types of sensors, high-performance transistors, composites that are both super-light and super-strong, even a graphene-based gel for spinal cord injuries that can help nerve cells communicate by conducting electricity between them.

In 2015, IBM achieved a breakthrough in carbon nanotubes — graphene rolled into a tubular shape — that opens the door to faster transistors that will pack exponentially more computing power onto a single silicon chip. In fact, taken to its logical conclusion, the ability to shrink transistors to nanoscale could lead to processors that combine vast power and tiny size in a way that could be called “smart dust” (good news for those of us who don’t prioritize good housekeeping).

But that’s not all we’ll be doing with graphene. Here are just a few examples of what researchers say this single super material is likely to bring us in the not-too-distant future:

  • batteries that last twice as long as they do now and could offer electric cars a 500-mile range on a single charge
  • solar cells that are up to 1,000 times more efficient
  • clothing that conducts electricity and has wireless connectivity
  • bendable, highly conductive display screens
  • water desalinization using 15 to 50 percent less energy
  • coatings that can be applied to almost any surface that needs protection from water and air
  • meteor-resistant spacecraft and lightweight bulletproof armor, both enabled by graphene’s ability to dissipate energy from incoming projectiles

Marveling at the possibilities

Amazingly, graphene barely scratches the surface. Consider these advanced materials, all of them currently in development, and let yourself marvel at how we might put them to work:

Nanomaterials artificially engineered at molecular scale are giving rise to cotton-blend fabric that kills bacteria or conducts electricity, a coating that makes objects so frictionless they give no tactile feedback, and ceramics that bounce back from extreme pressure.

Recyclable carbon fiber composites that can be turned back into liquid form and remolded will replace the current versions that can only go into landfills when they’re broken.

Ultra-thin silicon circuits will lead to high-performance medical instruments that can be not just worn, but implanted or swallowed.

Flexible solar cells will replace large, unwieldy solar panels with thin film that can go almost anywhere and be incorporated into almost anything, from windows to tents to clothing.

Rechargeable metal-air batteries that can store electricity in grid-scale amounts will bring plentiful low-cost, reliable energy to places that currently have unreliable or no access to the traditional power grid.

Biomaterials will allow us to build robotic structures out of engineered materials that mimic organic ones. Soft materials that can be activated by an electric field will give us a whole new take on the human/machine interface. The next generation of prosthetics, for example, will be more comfortable, more functional, and harder to distinguish from living flesh.

Metamaterials, synthetic composites designed at the inter-atomic level, will have properties not found in nature. Those of you who love Star Trek and/or Harry Potter will be thrilled at this example: Scientists have already created a thin skin of metamaterial that makes whatever it covers undetectable. That’s right—an actual invisibility cloak. (Unfortunately, non-Romulans and Muggles will probably have to wait quite a while for the retail version.)

Designing the future, one molecule at a time

More mind-boggling developments in materials science are on their way. The Materials Genome Initiative (MGI) is a multi-agency U.S. government project designed to help American institutions discover, develop, and deploy advanced materials at lower cost and bring them to market in less time. One central part of the initiative is a database attempting to map the hundreds of millions of different combinations of elements on the periodic table so that scientists can use artificial intelligence to predict what properties those combinations will have. As the database grows, scientists can draw on that data to determine how best to combine elements to create new super materials that have specific desired properties.
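To make that workflow concrete, here is a minimal sketch in Python of the general idea: represent each known compound as a feature vector derived from its composition, train a model on a measured property, and then screen untested combinations before anyone synthesizes them. The compositions, features, and property values below are invented for illustration and are not drawn from the MGI database.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: each row describes a composition by simple
# element-fraction features [fraction_C, fraction_Si, fraction_O, fraction_Fe];
# the target is a made-up "hardness" score.
X_train = np.array([
    [1.00, 0.00, 0.00, 0.00],   # pure carbon
    [0.00, 1.00, 0.00, 0.00],   # pure silicon
    [0.00, 0.33, 0.67, 0.00],   # silicon dioxide-like
    [0.00, 0.00, 0.00, 1.00],   # pure iron
    [0.50, 0.00, 0.00, 0.50],   # carbon-iron blend
])
y_train = np.array([9.0, 6.5, 7.0, 4.0, 8.0])  # invented property values

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Screen a new, untested composition before committing to an experiment.
candidate = np.array([[0.70, 0.30, 0.00, 0.00]])  # hypothetical carbon-silicon mix
print("Predicted hardness:", model.predict(candidate)[0])
```

The point of a growing materials database is to supply far richer descriptors and far more training examples than this toy, so that models can rank thousands of candidate combinations before any of them reach a lab.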

Of course, no technological advance is without its challenges, and the rise of the super materials is no exception. One technical hurdle that’s already pressing is the need to find ways to integrate graphene into a high-tech world in which industry and academia have already invested trillions of dollars in silicon. That sum is impossible to walk away from, so unless (until?) graphene supplants silicon entirely, factories, production lines, and research centers will have to be retooled so that both materials can co-exist in the same projects.

That said, advanced materials are a fundamental building block for change, so keep your eye on them as they develop. As super materials become exponentially easier to produce, we’ll start to see them in common use — imagine 3D printers that can create new objects with high-performance computing and battery power literally baked in. As they become more common, expect to see them weaving exponential technologies tightly into the fabric of daily life, both literally and figuratively, and bringing us ever-closer to a world of ambient intelligence. And as these foundation-shaking new materials become ubiquitous, it’s likely that they’ll make today’s technological marvels seem like a preschooler’s playthings.

Download the executive brief Super Materials: Building the Impossible


To learn more about how exponential technology will affect business and life, see Digital Futures in the Digitalist Magazine.


About Dan Wellers

Dan Wellers is founder and leader of Digital Futures at SAP, a strategic insights and thought leadership discipline that explores how digital technologies drive exponential change in business and society.

Pulling Cities Into The Future With Blockchain

Dan Wellers , Raimund Gross and Ulrich Scholl

The next wave of the digital economy is just over the horizon, and it could be built on the blockchain.

Blockchain technology has been rapidly growing in influence since 2015, when it became apparent that the technology underlying the relatively arcane concept of cryptocurrency could transform the financial system. By the end of 2016, major players like Bank of America and Goldman Sachs were laying claim to promising blockchain technologies, filing patents at roughly twice the pace they had at the start of the year.

Enthusiasm for blockchain is not just accelerating, but spreading beyond financial services, as SAP and other global organizations consider all the ways it could remove friction and risk in business transactions. From traditional vendors like IBM and Microsoft to leading consultancies including Accenture and Deloitte, some of the world’s biggest companies are acknowledging the many possibilities inherent in the ability to maintain distributed, tamper-proof ledgers that permanently and transparently record transactions. Yet as promising as blockchain already is, the business world may still be underestimating how profoundly it could change transactions, organizations, and industries. It could ultimately change the entire economy.

Trustworthy data and interactions are the cornerstone of the digital economy. As the physical world becomes ever more quantified, being able to guarantee the integrity and provenance of digital and physical assets and the transactions in which they’re involved will become a core competitive advantage — and blockchain is deliberately designed to embed that guarantee in every transaction. Distributed ledgers, smart contracts, and other blockchain technologies could form the foundation on which other exponential technologies combine and scale.

The basic idea is simple: IoT sensors in drones, autonomous vehicles, 3D printers, and augmented/virtual reality gear would collect and record data in blockchain-based decentralized ledgers. This data would be immediately verified and could be made instantly available for use by any application. Smart contracts programmed into the blockchain would then execute business processes by drawing on these vast repositories of live data. Everything could be further automated by adding artificial intelligence into blockchain smart contracts to make decisions without human involvement.
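As a rough illustration of that pattern (a single-process sketch, not a real distributed blockchain or any particular vendor’s platform), the snippet below hash-chains sensor readings into an append-only ledger, verifies the chain, and runs a simple rule standing in for a smart contract. Device names and thresholds are invented.

```python
import hashlib
import json
import time

class MiniLedger:
    """Append-only, hash-chained log: altering any earlier entry is detectable."""
    def __init__(self):
        self.chain = [{"index": 0, "data": "genesis", "prev_hash": "0"}]

    def _hash(self, block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append(self, data):
        prev = self.chain[-1]
        block = {
            "index": prev["index"] + 1,
            "timestamp": time.time(),
            "data": data,
            "prev_hash": self._hash(prev),
        }
        self.chain.append(block)
        return block

    def verify(self):
        # Recompute every link; a tampered block breaks the chain.
        return all(
            self.chain[i]["prev_hash"] == self._hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )

def resupply_contract(reading, threshold=10):
    """Stand-in for a smart contract: trigger a purchase when stock runs low."""
    if reading["stock_level"] < threshold:
        return f"Auto-ordering material for {reading['device_id']}"
    return "No action"

ledger = MiniLedger()
block = ledger.append({"device_id": "printer-42", "stock_level": 3})
print(ledger.verify())                   # True: the chain is intact
print(resupply_contract(block["data"]))  # The rule fires on the verified data
```

A production system would distribute the ledger across many nodes and reach consensus before a block is accepted; the sketch only shows why recorded data becomes trustworthy enough for automated decisions to act on it.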

Here are just a few of the possibilities that could be someday realized on a blockchain framework:

  • Democratized design and manufacture: A blockchain-enabled design and manufacturing platform would allow individuals and small businesses to play a larger role in the digital economy. Products designed from scratch in virtual reality, as well as copies of existing objects scanned with machine vision, could be easily bought, sold, shared, or even digitally remixed, at an affordable cost while protecting intellectual property rights. This would be true whether the work is a complex, multi-material physical product made with distributed 3D printers or text, music, and images.
  • Autonomous logistics: Intelligent, self-driving delivery vehicles could shuttle products and materials to their destinations, or even use onboard 3D printers to create them in the location where they’re needed, while using blockchain technology to execute and verify every transaction. Machine learning apps programmed into smart contracts, which are also embedded in the blockchain, could optimize routing. This could make the current centralized model of warehousing and logistics obsolete.
  • Distributed commerce: Combining blockchain with virtual reality, 3D scanning and printing, artificial intelligence, and autonomous vehicles could create immersive, personalized shopping experiences anywhere consumers want to have them. Shoppers could grant permission for vendors to access their purchase history, preferences, and other data stored on a blockchain ledger. Vendor AIs could then generate more accurate recommendations and interact with ecommerce bots that complete purchases automatically. Customers would receive promotions for new styles, medication refills, or replacement parts without even having to think about it. Critically, blockchain would allow buyers to limit access to their personal or proprietary data to specific organizations over a defined period of time, for example, until the end of their shopping experience or the close of their fiscal year.

This may seem like far-future speculation, but a provocative white paper from consulting firm Outlier Ventures Research claims this shift is both inevitable and already underway.

Envisioning the future city

The more technologies we connect using the blockchain as a framework, the more value we can derive. Imagine that a city has a digital ledger in which every house or apartment has a presence containing all relevant information about the home, from property ownership and mortgage balance to transactional data like utility use, property tax assessment, and past and current contractor relationships. The city could access this “digital twin” to coordinate services and perform administrative tasks related to the property more efficiently and with greater accuracy. The property owner would have a verified, trustworthy way to perform transactions like renting a room, hiring contractors to do lawn work, or selling power generated by solar panels back to the grid. The city utility company could feed power consumption data into an AI to generate energy-saving recommendations, and leverage smart contracts that automatically manage power consumption between smart appliances and the grid to lower costs and improve energy efficiency.

By linking together multiple technologies, this “smart city” could then begin to automate basic city services. For example, IoT sensors could instantly sense a problem (say, a downed electrical cable) and alert the appropriate city agency’s AI to dispatch a technician. The AI might help the technician assess the necessary repair through AR glasses, send templates for parts to the 3D printer in the technician’s truck, reimburse the parts designer through a smart contract, and guide the repair via the AR glasses before finally informing the city agency and property owner when the repair is complete.

Now imagine extending that to the city’s broader infrastructure. A business traveler hops into an autonomous electric taxi at the airport and tells it to take her to a meeting in the city center. Knowing from traffic sensor data that there’s been an accident on the highway, the car automatically chooses an alternate route that ends at the parking lot nearest its destination with an available outlet for charging. As the car parks itself, it connects to an outlet that bills the taxi company in real time for the amount of electricity needed to top up the car battery. As the traveler leaves the parking lot and connects to the city’s public wifi via a social media account, she immediately receives a push notification with a discount at the nearby coffee shop. She stops for coffee and heads for her destination, where the elevator recognizes her phone and automatically takes her to the correct floor for her meeting, right on time.

Meanwhile, city staff can monitor the taxi’s safe operation and ensure the taxi company bills accurately for the ride, check traffic status and push out notifications to all affected drivers, make sure parking is available, confirm the traveler’s opt-in agreement for city wifi, provide the coffee shop’s owner with information on the effectiveness of the day’s coupon, and confirm that the building’s elevators are functioning according to the latest safety codes. Every interaction is transparent, verifiable, and nearly impossible to fake or alter — and just as importantly, it adds to a vast store of data the city can then use machine learning to analyze for future improvements and efficiencies.

A multitude of possibilities

The disruptive potential of already exponential technologies multiplies by orders of magnitude when they can intersect and combine. With blockchain creating the framework for that to happen, it’s not entirely hyperbole to put the potential economic transformation on par with the Industrial Revolution. But companies can’t simply wait until digital transformation is upon us.  Organizations need to start right now to think through the likely impacts in a disciplined and proactive way. Developing scenarios for the multitude of possibilities prepares us to maximize positive outcomes.

Read the executive brief Running Future Cities on Blockchain.

Learn how your business can Run Live with blockchain.



About Dan Wellers

Dan Wellers is founder and leader of Digital Futures at SAP, a strategic insights and thought leadership discipline that explores how digital technologies drive exponential change in business and society.


About Raimund Gross

Raimund Gross is a solution architect and futurist at SAP Innovation Center Network, where he evaluates emerging technologies and trends to address the challenges businesses face as a result of digitization. He is currently evaluating the impact of blockchain for SAP and our enterprise customers.


About Ulrich Scholl

Ulrich Scholl is Vice President of Industry Cloud and Custom Development at SAP. In this role, Ulrich discovers and implements best practices to help further the understanding and adoption of the SAP portfolio of industry cloud innovations.

The Future Of Learning - Keeping Up With The Digital Economy

Dan Wellers and Michael Rander

Learning is perhaps the only area still largely untouched by digital transformation. It’s not just that curriculums aren’t keeping up with the skills required for a future of exponential change in which skills learned today can become obsolete in years or even months. Our entire standard approach to education — top-down, one-size-fits-most, heavily biased against collaboration, and generally ending in young adulthood at the latest — has barely changed since the Industrial Revolution. No wonder the status quo is a poor match for an imminent future in which entire groups of people within specific job types and industries will be made redundant by automation and will desperately need new skills to adapt to the changing workplace.

Granted, it’s now possible to download smartphone apps that turn foreign language learning into a game, squeeze bite-sized lessons in everything from history to coding into ten-minute blocks of free time, or quantify various non-classroom activities as work-related training. But while these technologies can be efficient tools to help individuals acquire specific new skills and prove what they already know, they ignore the much more pressing and universal issue: the future is digital, and anyone whose skills are insufficient or outmoded will be left behind. If we hope to have a real impact and avert the potential disaster of massive, permanent global unemployment, we must also radically rethink learning at the societal level.

Technology is not enough

Researchers at the University of Southern California are working to develop a cognitive neural prosthesis they hope will allow people with traumatic brain injuries to literally download muscle memory and motor function. If it works, we may in the future be able to buy or rent knowledge as we need it and import it into our minds in minutes, Matrix-style.

That’s an enormous “if,” though, and it’s not going to come soon enough.

USC research notwithstanding, a true learning revolution seems unlikely to arise in the near future. Indeed, our fundamental models of acquiring knowledge have changed very little in the last 15 years. Yet in the same timeframe, the digital economy has convulsed the workplace.

Large global organizations are already struggling with the inability to find or train enough employees with the right types of skills to keep them competitive. At the same time, digitization is affecting entire industries in ways so rapid and profound that it could be described as an extinction event. We are, for example, probably no more than a decade away from a wrenching dislocation in trucking, taxis, delivery services, and other transportation-based businesses as truly autonomous vehicles make human delivery drivers a relic of a bygone era.

These dramatic changes in the workforce are hollowing out the middle class and creating a need that cannot be filled by our current systems of learning. Instead, we must work together to address them at a systemic level.

The true question for forward-thinking companies

The question we face now, as individuals, as businesses, and as a society, is what we will do when the digital economy ejects vast numbers of people from their jobs, and they lack the skills needed to find new ones. If we hope to address this question, the standard model of learning needs to change — and business needs to take a leading role in driving that change, says Jenny Dearborn, Senior Vice President and Chief Learning Officer for SAP.

For example, she notes, the 100 largest companies in the United States collectively employ 18 million people, yet the entire country spends only 0.1% of GDP on job retraining, workforce development centers, and adult education subsidies.

Much of that spending is in community or corporate silos. For example, many organizations are expanding their talent pool by reaching out to local schools and colleges to improve collaboration. SAP is already taking steps in this direction through high school partnerships such as those with Skyline High School in Oakland, California, and Business Technology Early College High School in Queens, New York, where students get hands-on workplace training from SAP employee mentors and graduate with both a high school diploma and a technology-focused associate degree from a local two-year college.

These piecemeal and individual corporate efforts are successful, but they can only go so far. And this issue isn’t limited to the United States; it’s a global economic imperative in which every major employer worldwide has a responsibility to participate.

A higher degree of learning

So what might we do? Dearborn has a few suggestions:

We could create bold programs to educate adults and advance innovation. China is reportedly investing $250 billion a year in young adult education. The US did something similar after World War II by introducing the GI Bill. Why not revisit the idea as a national or even multinational program?

We could invest more in technical and vocational teachers, both to recruit more to the field and to increase the number of students they can serve.

We could revive the apprentice model and implement it at a much broader level in the corporate world. This would give students and new graduates much earlier exposure to the real world of work, while allowing them to earn both experience and a salary in the process. Hilton Worldwide, for example, offers a range of apprenticeship programs in Europe for young people who want to work in the hotel industry.

We could increase tax incentives for investing in proven effective methods of closing skills gaps: internal employee training and development, involvement with local schools and communities, external training and certification programs, and veteran hiring programs.

Most of all, we could break down the geographic and political silos that hinder adult job and skills training and retraining programs. Imagine the impact the CLOs of those 100 leading US companies could have if they worked together. Now imagine expanding that worldwide. In addition to coordinating among CLOs, employers must work with national, regional, and local programs that target adult learners so we can multiply effectiveness, eliminate redundancies, and share best practices.

Doing this will help every business compete in an increasingly global economy with a tight market for skills, but it will also have a much broader outcome. Since business has helped to create the workplace risks and challenges of the digital economy, we also bear some collective responsibility for mitigating them. In doing so, we will have enormous influence in shaping the future of learning — and the future of business itself.

Read the executive brief Taking Learning Back to School.



About Dan Wellers

Dan Wellers is founder and leader of Digital Futures at SAP, a strategic insights and thought leadership discipline that explores how digital technologies drive exponential change in business and society.

About Michael Rander

Michael Rander is the Global Research Director for Future Of Work at SAP. He is an experienced project manager, strategic and competitive market researcher, operations manager as well as an avid photographer, athlete, traveler and entrepreneur. Share your thoughts with Michael on Twitter @michaelrander.

Data Lakes: Deep Insights

Timo Elliott, John Schitka, Michael Eacrett, and Carolyn Marsan

Dan McCaffrey has an ambitious goal: solving the world’s looming food shortage.

As vice president of data and analytics at The Climate Corporation (Climate), which is a subsidiary of Monsanto, McCaffrey leads a team of data scientists and engineers who are building an information platform that collects massive amounts of agricultural data and applies machine-learning techniques to discover new patterns. These analyses are then used to help farmers optimize their planting.

“By 2050, the world is going to have too many people at the current rate of growth. And with shrinking amounts of farmland, we must find more efficient ways to feed them. So science is needed to help solve these things,” McCaffrey explains. “That’s what excites me.”

“The deeper we can go into providing recommendations on farming practices, the more value we can offer the farmer,” McCaffrey adds.

But to deliver that insight, Climate needs data—and lots of it. That means using remote sensing and other techniques to map every field in the United States and then combining that information with climate data, soil observations, and weather data. Climate’s analysts can then produce a massive data store that they can query for insights.

Meanwhile, precision tractors stream data into Climate’s digital agriculture platform, which farmers can access from iPads through simple data flows and visualizations. They gain insights that help them optimize their seeding rates, soil health, and fertility applications. The overall goal is to increase crop yields, which in turn boosts a farmer’s margins.

Climate is at the forefront of a push toward deriving valuable business insight from Big Data that isn’t just big, but vast. Companies of all types—from agriculture through transportation and financial services to retail—are tapping into massive repositories of data known as data lakes. They hope to discover correlations that they can exploit to expand product offerings, enhance efficiency, drive profitability, and discover new business models they never knew existed.

The internet democratized access to data and information for billions of people around the world. Ironically, however, access to data within businesses has traditionally been limited to a chosen few—until now. Today’s advances in memory, storage, and data tools make it possible for companies both large and small to cost effectively gather and retain a huge amount of data, both structured (such as data in fields in a spreadsheet or database) and unstructured (such as e-mails or social media posts). They can then allow anyone in the business to access this massive data lake and rapidly gather insights.

It’s not that companies couldn’t do this before; they just couldn’t do it cost effectively and without a lengthy development effort by the IT department. With today’s massive data stores, line-of-business executives can generate queries themselves and quickly churn out results—and they are increasingly doing so in real time. Data lakes have democratized both the access to data and its role in business strategy.

Indeed, data lakes move data from being a tactical tool for implementing a business strategy to being a foundation for developing that strategy through a scientific-style model of experimental thinking, queries, and correlations. In the past, companies’ curiosity was limited by the expense of storing data for the long term. Now companies can keep data for as long as it’s needed. And that means companies can continue to ask important questions as they arise, enabling them to future-proof their strategies.
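A toy sketch of that self-service, experiment-as-you-go pattern, using a local folder and pandas in place of real lake infrastructure (file names, fields, and values are invented; production lakes typically sit on distributed object storage and are queried with engines such as Spark or Presto):

```python
import json
from pathlib import Path

import pandas as pd

lake = Path("data_lake")
lake.mkdir(exist_ok=True)

# Structured data lands as-is, e.g. a CSV export from a transactional system...
pd.DataFrame({
    "machine_id": ["A1", "A2", "A1"],
    "units_sold": [12, 7, 15],
}).to_csv(lake / "sales.csv", index=False)

# ...while semi-structured event data lands as JSON lines, untouched.
with open(lake / "events.json", "w") as f:
    for event in [{"machine_id": "A1", "temp_c": 4.1},
                  {"machine_id": "A2", "temp_c": 9.8}]:
        f.write(json.dumps(event) + "\n")

# An analyst can join the raw files ad hoc, with no upfront schema project.
sales = pd.read_csv(lake / "sales.csv")
events = pd.read_json(lake / "events.json", lines=True)
print(sales.merge(events, on="machine_id"))
```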

Prescriptive Farming

Climate’s McCaffrey has many questions to answer on behalf of farmers. Climate provides several types of analytics to farmers including descriptive services, which are metrics about the farm and its operations, and predictive services related to weather and soil fertility. But eventually the company hopes to provide prescriptive services, helping farmers address all the many decisions they make each year to achieve the best outcome at the end of the season. Data lakes will provide the answers that enable Climate to follow through on its strategy.

Behind the scenes at Climate is a deep-science data lake that provides insights, such as predicting the fertility of a plot of land by combining many data sets to create accurate models. These models allow Climate to give farmers customized recommendations based on how their farm is performing.

“Machine learning really starts to work when you have the breadth of data sets from tillage to soil to weather, planting, harvest, and pesticide spray,” McCaffrey says. “The more data sets we can bring in, the better machine learning works.”

The deep-science infrastructure already has terabytes of data but is poised for significant growth as it handles a flood of measurements from field-based sensors.

“That’s really scaling up now, and that’s what’s also giving us an advantage in our ability to really personalize our advice to farmers at a deeper level because of the information we’re getting from sensor data,” McCaffrey says. “As we roll that out, our scale is going to increase by several magnitudes.”

Also on the horizon is more real-time data analytics. Currently, Climate receives real-time data from its application that streams data from the tractor’s cab, but most of its analytics applications are run nightly or even seasonally.

In August 2016, Climate expanded its platform to third-party developers so other innovators can also contribute data, such as drone-captured data or imagery, to the deep-science lake.

“That helps us in a lot of ways, in that we can get more data to help the grower,” McCaffrey says. “It’s the machine learning that allows us to find the insights in all of the data. Machine learning allows us to take mathematical shortcuts as long as you’ve got enough data and enough breadth of data.”

Predictive Maintenance

Growth is essential for U.S. railroads, which reinvest a significant portion of their revenues in maintenance and improvements to their track systems, locomotives, rail cars, terminals, and technology. With an eye on growing its business while also keeping its costs down, CSX, a transportation company based in Jacksonville, Florida, is adopting a strategy to make its freight trains more reliable.

In the past, CSX maintained its fleet of locomotives through regularly scheduled maintenance activities, which prevent failures in most locomotives as they transport freight from shipper to receiver. To achieve even higher reliability, CSX is tapping into a data lake to power predictive analytics applications that will improve maintenance activities and prevent more failures from occurring.

Beyond improving customer satisfaction and raising revenue, CSX’s new strategy also has major cost implications. Trains are expensive assets, and it’s critical for railroads to drive up utilization, limit unplanned downtime, and prevent catastrophic failures to keep the costs of those assets down.

That’s why CSX is putting all the data related to the performance and maintenance of its locomotives into a massive data store.

“We are then applying predictive analytics—or, more specifically, machine-learning algorithms—on top of that information that we are collecting to look for failure signatures that can be used to predict failures and prescribe maintenance activities,” says Michael Hendrix, technical director for analytics at CSX. “We’re really looking to better manage our fleet and the maintenance activities that go into that so we can run a more efficient network and utilize our assets more effectively.”
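A minimal, hypothetical sketch of that failure-signature approach, with invented sensor readings rather than CSX telemetry: label historical readings by whether the asset later failed, train a classifier, then score assets still in service so maintenance can be prescribed before a breakdown.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical history: [vibration, bearing_temp_c, oil_particle_count]
X_history = np.array([
    [0.2, 60, 10],
    [0.3, 65, 12],
    [0.9, 95, 80],
    [1.1, 102, 95],
    [0.4, 70, 20],
    [1.0, 98, 90],
])
y_failed = np.array([0, 0, 1, 1, 0, 1])  # 1 = failed within 30 days

model = GradientBoostingClassifier(random_state=0)
model.fit(X_history, y_failed)

# Score today's readings from a locomotive still in service.
current = np.array([[0.85, 92, 70]])
risk = model.predict_proba(current)[0, 1]
print(f"Failure risk: {risk:.0%}")
if risk > 0.5:
    print("Prescribe maintenance before the next run.")
```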

“In the past we would have to buy a special storage device to store large quantities of data, and we’d have to determine cost benefits to see if it was worth it,” says Donna Crutchfield, assistant vice president of information architecture and strategy at CSX. “So we were either letting the data die naturally, or we were only storing the data that was determined to be the most important at the time. But today, with the new technologies like data lakes, we’re able to store and utilize more of this data.”

CSX can now combine many different data types, such as sensor data from across the rail network and other systems that measure movement of its cars, and it can look for correlations across information that wasn’t previously analyzed together.

One of the larger data sets that CSX is capturing comprises the findings of its “wheel health detectors” across the network. These devices capture different signals about the bearings in the wheels, as well as the health of the wheels in terms of impact, sound, and heat.

“That volume of data is pretty significant, and what we would typically do is just look for signals that told us whether the wheel was bad and if we needed to set the car aside for repair. We would only keep the raw data for 10 days because of the volume and then purge everything but the alerts,” Hendrix says.

With its data lake, CSX can keep the wheel data for as long as it likes. “Now we’re starting to capture that data on a daily basis so we can start applying more machine-learning algorithms and predictive models across a larger history,” Hendrix says. “By having the full data set, we can better look for trends and patterns that will tell us if something is going to fail.”

Another key ingredient in CSX’s data set is locomotive oil. By analyzing oil samples, CSX is developing better predictions of locomotive failure. “We’ve been able to determine when a locomotive would fail and predict it far enough in advance so we could send it down for maintenance and prevent it from failing while in use,” Crutchfield says.

“Between the locomotives, the tracks, and the freight cars, we will be looking at various ways to predict those failures and prevent them so we can improve our asset allocation. Then we won’t need as many assets,” she explains. “It’s like an airport. If a plane has a failure and it’s due to connect at another airport, all the passengers have to be reassigned. A failure affects the system like dominoes. It’s a similar case with a railroad. Any failure along the road affects our operations. Fewer failures mean more asset utilization. The more optimized the network is, the better we can service the customer.”

Detecting Fraud Through Correlations

Traditionally, business strategy has been a very conscious practice, presumed to emanate mainly from the minds of experienced executives, daring entrepreneurs, or high-priced consultants. But data lakes take strategy out of that rarefied realm and put it in the environment where just about everything in business seems to be going these days: math—specifically, the correlations that emerge from applying a mathematical algorithm to huge masses of data.

The Financial Industry Regulatory Authority (FINRA), a nonprofit group that regulates broker behavior in the United States, used to rely on the experience of its employees to come up with strategies for combating fraud and insider trading. It still does that, but now FINRA has added a data lake to find patterns that a human might never see.

Overall, FINRA processes over five petabytes of transaction data from multiple sources every day. By switching from traditional database and storage technology to a data lake, FINRA was able to set up a self-service process that allows analysts to query data themselves without involving the IT department; search times dropped from several hours to 90 seconds.

While traditional databases were good at defining relationships with data, such as tracking all the transactions from a particular customer, the new data lake configurations help users identify relationships that they didn’t know existed.

Leveraging its data lake, FINRA creates an environment for curiosity, empowering its data experts to search for suspicious patterns of fraud, market manipulation, and compliance violations. As a result, FINRA handed out 373 fines totaling US$134.4 million in 2016, a new record for the agency, according to Law360.

Data Lakes Don’t End Complexity for IT

Though data lakes make access to data and analysis easier for the business, they don’t necessarily make the CIO’s life a bed of roses. Implementations can be complex, and companies rarely want to walk away from investments they’ve already made in data analysis technologies, such as data warehouses.

“There have been so many millions of dollars going to data warehousing over the last two decades. The idea that you’re just going to move it all into a data lake isn’t going to happen,” says Mike Ferguson, managing director of Intelligent Business Strategies, a UK analyst firm. “It’s just not compelling enough of a business case.” But Ferguson does see data lake efficiencies freeing up the capacity of data warehouses to enable more query, reporting, and analysis.

Data lakes also don’t free companies from the need to clean up and manage data as part of the process required to gain these useful insights. “The data comes in very raw, and it needs to be treated,” says James Curtis, senior analyst for data platforms and analytics at 451 Research. “It has to be prepped and cleaned and ready.”

Companies must have strong data governance processes, as well. Customers are increasingly concerned about privacy, and rules for data usage and compliance have become stricter in some areas of the globe, such as the European Union.

Companies must therefore create data usage policies that clearly define who can access, distribute, change, delete, or otherwise manipulate all that data. They must also make sure that the data they collect comes from a legitimate source.

Many companies are responding by hiring chief data officers (CDOs) to ensure that as more employees gain access to data, they use it effectively and responsibly. Indeed, research company Gartner predicts that 90% of large companies will have a CDO by 2019.

Data lakes can be configured in a variety of ways: centralized or distributed, with storage on premises, in the cloud, or both. Some companies have more than one data lake implementation.

“A lot of my clients try their best to go centralized for obvious reasons. It’s much simpler to manage and to gather your data in one place,” says Ferguson. “But they’re often plagued somewhere down the line with much more added complexity and realize that in many cases the data lake has to be distributed to manage data across multiple data stores.”

Meanwhile, the massive capacities of data lakes mean that data that once flowed through a manageable spigot is now blasting at companies through a fire hose.

“We’re now dealing with data coming out at extreme velocity or in very large volumes,” Ferguson says. “The idea that people can manually keep pace with the number of data sources that are coming into the enterprise—it’s just not realistic any more. We have to find ways to take complexity away, and that tends to mean that we should automate. The expectation is that the information management software, like an information catalog for example, can help a company accelerate the onboarding of data and automatically classify it, profile it, organize it, and make it easy to find.”

Beyond the technical issues, IT and the business must also make important decisions about how data lakes will be managed and who will own the data, among other things (see How to Avoid Drowning in the Lake).

How to Avoid Drowning in the Lake

The benefits of data lakes can be squandered if you don’t manage the implementation and data ownership carefully.

Deploying and managing a massive data store is a big challenge. Here’s how to address some of the most common issues that companies face:

Determine the ROI. Developing a data lake is not a trivial undertaking. You need a good business case, and you need a measurable ROI. Most importantly, you need initial questions that can be answered by the data, which will prove its value.

Find data owners. As devices with sensors proliferate across the organization, the issue of data ownership becomes more important.

Have a plan for data retention. Companies used to have to cull data because it was too expensive to store. Now companies can become data hoarders. How long do you store it? Do you keep it forever?

Manage descriptive data. Software that allows you to tag all the data in one or multiple data lakes and keep it up-to-date is not mature yet. We still need tools to bring the metadata together to support self-service and to automate metadata to speed up the preparation, integration, and analysis of data.

Develop data curation skills. There is a huge skills gap for data repository development. But many people will jump at the chance to learn these new skills if companies are willing to pay for training and certification.

Be agile enough to take advantage of the findings. It used to be that you put in a request to the IT department for data and had to wait six months for an answer. Now, you get the answer immediately. Companies must be agile to take advantage of the insights.

Secure the data. Besides the perennial issues of hacking and breaches, much data lake software is open source and less secure than typical enterprise-class software.

Measure the quality of data. Different users can work with varying levels of quality in their data. For example, data scientists working with a huge number of data points might not need completely accurate data, because they can use machine learning to cluster data or discard outlying data as needed. However, a financial analyst might need the data to be completely correct; a small illustration of these two approaches follows these tips.

Avoid creating new silos. Data lakes should work with existing data architectures, such as data warehouses and data marts.
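To make the “measure the quality of data” point above concrete (the numbers are invented), the sketch below shows how an exploratory analysis might simply drop statistical outliers and keep moving, while a finance-grade workflow would flag the same records for correction at the source:

```python
import numpy as np

readings = np.array([10.1, 9.8, 10.3, 9.9, 250.0, 10.2, 0.0, 10.0])

# Robust outlier test: distance from the median, scaled by the
# median absolute deviation (MAD).
median = np.median(readings)
mad = np.median(np.abs(readings - median))
is_outlier = np.abs(readings - median) > 10 * mad

# Path 1: a data scientist exploring patterns just discards the suspect points.
print("Kept for exploration:", readings[~is_outlier])

# Path 2: a financial analyst needs every record correct, so outliers are
# flagged for manual correction rather than dropped.
for value in readings[is_outlier]:
    print(f"Flag {value} for correction before reporting.")
```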

From Data Queries to New Business Models

The ability of data lakes to uncover previously hidden data correlations can massively impact any part of the business. For example, in the past, a large soft drink maker used to stock its vending machines based on local bottlers’ and delivery people’s experience and gut instincts. Today, using vast amounts of data collected from sensors in the vending machines, the company can essentially treat each machine like a retail store, optimizing the drink selection by time of day, location, and other factors. Doing this kind of predictive analysis was possible before data lakes came along, but it wasn’t practical or economical at the individual machine level because the amount of data required for accurate predictions was simply too large.

The next step is for companies to use the insights gathered from their massive data stores not just to become more efficient and profitable in their existing lines of business but also to actually change their business models.

For example, product companies could shield themselves from the harsh light of comparison shopping by offering the use of their products as a service, with sensors on those products sending the company a constant stream of data about when they need to be repaired or replaced. Customers are spared the hassle of dealing with worn-out products, and companies are protected from competition as long as customers receive the features, price, and the level of service they expect. Further, companies can continuously gather and analyze data about customers’ usage patterns and equipment performance to find ways to lower costs and develop new services.

Data for All

Given the tremendous amount of hype that has surrounded Big Data for years now, it’s tempting to dismiss data lakes as a small step forward in an already familiar technology realm. But it’s not the technology that matters as much as what it enables organizations to do. By making data available to anyone who needs it, for as long as they need it, data lakes are a powerful lever for innovation and disruption across industries.

“Companies that do not actively invest in data lakes will truly be left behind,” says Anita Raj, principal growth hacker at DataRPM, which sells predictive maintenance applications to manufacturers that want to take advantage of these massive data stores. “So it’s just the option of disrupt or be disrupted.”

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Timo Elliott is Vice President, Global Innovation Evangelist, at SAP.

John Schitka is Senior Director, Solution Marketing, Big Data Analytics, at SAP.

Michael Eacrett is Vice President, Product Management, Big Data, Enterprise Information Management, and SAP Vora, at SAP.

Carolyn Marsan is a freelance writer who focuses on business and technology topics.


About Timo Elliott

Timo Elliott is an Innovation Evangelist for SAP and a passionate advocate of innovation, digital business, analytics, and artificial intelligence. He was the eighth employee of BusinessObjects, and for the last 25 years he has worked closely with SAP customers around the world on new technology directions and their impact on real-world organizations. His articles have appeared in publications such as Harvard Business Review, Forbes, ZDNet, The Guardian, and Digitalist Magazine. He has worked in the UK, Hong Kong, New Zealand, and Silicon Valley, and currently lives in Paris, France. He has a degree in Econometrics and a patent in mobile analytics.

Artificial Intelligence: The Future Of Oil And Gas

Anoop Srivastava

Oil prices have fallen dramatically over the last few years, forcing some major oil companies to take drastic actions such as layoffs and cuts to investments and budgets. Shell, for example, shelved its plan to invest in Qatar, Aramco put its deep-water exploration in the Red Sea on hold, Schlumberger laid off a few thousand employees, and the list goes on…

In view of falling oil prices and the resulting squeeze on cash flows, the oil and gas industry has been challenged to adapt and optimize its performance to remain profitable while maintaining a long-term investment and operating outlook. Currently, oil and gas companies find it difficult to maintain the same level of investment in exploration and production as when crude prices were at their peak. Operating in the oil and gas industry today means balancing a dizzying array of trade-offs in the drive for competitive advantage while maximizing return on investment.

The result is a dire need to optimize performance and reduce the cost of production per barrel. Companies have many optimization opportunities once they start using the massive amounts of data generated by oil fields. Oil and gas companies can turn this crisis into an opportunity by leveraging technological innovations like artificial intelligence to build a foundation for long-term success. If volatility in oil prices is the new norm, the push for “value over volume” is the key to success going forward.

Using AI tools, upstream oil and gas companies can shift their approach from production at all costs to producing in context. They will need to do profit and loss management at the well level to optimize the production cost per barrel. To do this, they must integrate all aspects of production management, collect the data for analysis and forecasting, and leverage artificial intelligence to optimize operations.

When remote sensors are connected to wireless networks, data can be collected and centrally analyzed from any location. According to the consulting firm McKinsey, the oil and gas supply chain stands to gain $50 billion in savings and increased profit by adopting AI. As an example, using AI algorithms to more accurately sift through signals and noise in seismic data can decrease dry wellhead development by 10 percent.
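To see the signal-versus-noise idea on synthetic data (real seismic processing uses far more sophisticated models than this), even a simple moving-average filter can make a buried reflection event stand out:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)

# Synthetic trace: one reflection pulse centered at t = 0.5, buried in noise.
clean = np.exp(-((t - 0.5) ** 2) / 0.001)
noisy = clean + rng.normal(scale=0.4, size=t.size)

# Crude denoising: moving average over a short window.
window = np.ones(15) / 15
smoothed = np.convolve(noisy, window, mode="same")

# After filtering, the reflection's location can be recovered reliably.
print("Peak location (true):    ", t[np.argmax(clean)])
print("Peak location (filtered):", t[np.argmax(smoothed)])
```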

How oil and gas can leverage artificial intelligence

1. Planning and forecasting

On a macro scale, deep machine learning can help increase awareness of macroeconomic trends to drive investment decisions in exploration and production. Economic conditions and even weather patterns can be considered to determine where investments should take place, as well as the intensity of production.

2. Eliminating costly risks in drilling

Drilling is an expensive and risky investment, and applying AI in the operational planning and execution stages can significantly improve well planning, real-time drilling optimization, frictional drag estimation, and well cleaning predictions. Additionally, geoscientists can better assess variables such as the rate of penetration (ROP) improvement, well integrity, operational troubleshooting, drilling equipment condition recognition, real-time drilling risk recognition, and operational decision-making.

When drilling, machine-learning software takes into consideration a plethora of factors, such as seismic vibrations, thermal gradients, and strata permeability, along with more traditional data such as pressure differentials. AI can help optimize drilling operations by driving decisions such as direction and speed in real time, and it can predict failure of equipment such as electric submersible pumps (ESPs) to reduce unplanned downtime and equipment costs.
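A minimal sketch of that real-time decision loop, with invented drilling runs and parameter names (real systems use far richer physics and data): a model learns how controllable parameters relate to rate of penetration (ROP), and the controller picks the best candidate setting that stays under a vibration limit.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical past runs: [weight_on_bit_klbf, rotary_speed_rpm, vibration_level]
X_runs = np.array([
    [15, 80, 0.2],
    [20, 100, 0.4],
    [25, 120, 0.7],
    [30, 140, 1.1],
    [22, 110, 0.5],
])
rop_ft_per_hr = np.array([40, 55, 68, 75, 60])

model = LinearRegression().fit(X_runs, rop_ft_per_hr)

# Candidate settings for the next interval, each with an expected vibration level.
candidates = np.array([
    [18, 90, 0.3],
    [26, 125, 0.8],
    [32, 150, 1.3],   # fastest, but exceeds the vibration limit
])
predicted_rop = model.predict(candidates)

VIBRATION_LIMIT = 1.0
allowed = candidates[:, 2] <= VIBRATION_LIMIT
best = int(np.argmax(np.where(allowed, predicted_rop, -np.inf)))
print("Chosen setting:", candidates[best],
      "| predicted ROP:", round(float(predicted_rop[best]), 1))
```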

3. Well, reservoir, and facility management

Well, reservoir, and facility management includes the integration of multiple disciplines: reservoir engineering, geology, production technology, petrophysics, operations, and seismic interpretation. AI can help create tools that allow asset teams to build professional understanding and identify opportunities to improve operational performance.

AI techniques can also be applied to other activities such as reservoir characterization, modeling, and field surveillance. Fuzzy logic, artificial neural networks, and expert systems are used extensively across the industry to accurately characterize reservoirs in order to attain optimum production levels.

Today, AI systems form the backbone of digital oil field (DOF) concepts and implementations. However, there is still great potential for new ways to optimize field development and production costs, prolong field life, and increase the recovery factor.

4. Predictive maintenance

Today, artificial intelligence is taking the industry by storm. AI-powered software and sensor hardware enable us to use very large amounts of data to gain real-time responses on the best future course of action. With predictive analytics and cognitive security, for example, oil and gas companies can operate equipment safely and securely while receiving recommendations on how to avoid future equipment failure or mitigate potential security breaches.

5. Oil and gas well surveying and inspections

Drones have been part of the oil and gas industry since 2013, when ConocoPhillips used the Boeing ScanEagle drone in trials in the Chukchi Sea.  In June 2014, the Federal Aviation Administration (FAA) issued the first commercial permit for drone use over United States soil to BP, allowing the company to survey pipelines, roads, and equipment in Prudhoe Bay, Alaska. In January, Sky-Futures completed the first drone inspection in the Gulf of Mexico.

While drones are primarily used in the midstream sector, they can be applied to almost every aspect of the industry, including land surveying and mapping, well and pipeline inspections, and security. Technology is being developed to enable drones to detect methane leaks early. In addition, drones could one day be used to find oil and gas reservoirs underlying remote, uninhabited regions, from the comfort of a warm office.

6. Remote logistics

As logistics to offshore locations is always a challenge, AI-enhanced drones can be used to deliver materials to remote offshore locations.

Current adoption of AI

Chevron is currently using AI to identify new well locations and stimulation candidates in California. By using AI software to analyze the company’s large collection of historical well performance data, the company is drilling in better locations and has seen production rise 30% over conventional methods. Chevron is also using predictive models to analyze the performance of thousands of pieces of rotating equipment to detect failures before they occur. By addressing problems before they become critical, Chevron has avoided unplanned shutdowns and lowered repair expenses. Increased production and lower costs have translated to more profit per well.

Future journey

Today’s oil and gas industry has been transformed by two industry downturns in one decade. Although adoption of new hard technology such as directional drilling and hydraulic fracturing (fracking) has helped, the oil and gas industry needs to continue to innovate in today’s low-price market to survive. AI has the potential to differentiate companies that thrive and those that are left behind.

The promise of AI is already being realized in the oil and gas industry. Early adopters are taking advantage of their position  to get a head start on the competition and protect their assets. The industry has always leveraged technology to adapt to change, and early adopters have always benefited the most. As competition in the oil and gas industry continues to heat up, companies cannot afford to be left behind. For those that understand and seize the opportunities inherent in adopting cognitive technologies, the future looks bright.

For more insight on advanced technology in the energy sector, see How Digital Transformation Is Refueling The Energy Industry.



About Anoop Srivastava

Anoop Srivastava is Senior Director of the Energy and Natural Resources Industries at SAP Value Engineering in the Middle East and North Africa. He advises clients on their digital transformation strategies and helps them align their business strategy with IT strategy, leveraging digital technology innovations such as the Internet of Things, Big Data, advanced analytics, and cloud computing. He has more than 21 years of work experience spanning the oil and gas industry, business consulting, industry value advisory, and digital transformation.