Analytics, mobile apps, social networks, cloud computing, and the Internet of Things are quickly turning traditional business models upside down. As digitization moves from an innovative trend to an all-encompassing reality, companies need to understand and take advantage of digital technology across all aspects of their operations. Otherwise, they risk falling behind in delivering on market expectations and in creating products, services, and delivery models that bring value to customers and the business.
According to The Economist Intelligence Unit (EIU) report “Digitising IT,” sponsored by SAP, 97% of business and IT leaders are engaging their organizations in a digital initiative. Yet a comprehensive digital strategy is rarely in place. Although 86% of respondents believe that digital transformation is the most important action to achieve better outcomes from their digital investments, only half of them believe that they fully understand it. Under these circumstances, digital transformation is, well, daunting.
Enterprise support value maps: A new phase in support service and IT expertise
There are many ways to reach a destination, but it is often difficult to find the right path. A full understanding of the potential of digital technology enables companies to make informed decisions about their path to digital and in which areas to invest. With the right guidance and digital skills, businesses can reduce the time to value and benefit from concrete and demonstrable outcomes.
Enterprise support value maps provide support and guided access to the knowledge, skills, and services needed to drive innovation. Covering topics such as custom code management, security, and data volume management, value maps empower members to build up their digital proficiency and prepare the IT landscape for innovation. And with topics such as in-memory computing, digital core platform implementation, and cloud solutions, employees are prepared to take the next step in their business’ digital transformation journey.
With this insight, businesses can accelerate their digital agenda by answering critical questions such as:
Which services and tools can best address my business challenges?
Which support services and on-demand expertise can help empower my employees?
How long will it take and what skill level do I need?
What is the best approach to prepare for and adopt innovation?
How can IT optimize software landscape operations and maximize efficiency while ensuring data security and regulatory compliance?
Michael Kleinemeier, member of the Executive Board of SAP SE and head of the Digital Business Services organization at SAP, says, “The feedback from our customers and research vendors, such as Brandon Hall Group, speaks for itself. Value maps help drive digital transformation, unlock business value, and realize measurable benefits by integrating processes, operations, and technology. This methodology for future-oriented, social media-based learning and support is a key part of successful digital transformations.”
Take, for example, Carsa S.A., a leading retail company in Argentina that wanted to innovate in the cloud, integrate SAP SuccessFactors into its landscape, and leverage integration capabilities between cloud and on-premises solutions. The company decided to keep things as simple as possible, follow best practices, and reduce total cost of ownership. By registering for and participating in the value map for cloud and hybrid implementations, Carsa was well-equipped to make strategic decisions, having upskilled with best practices, documents, meet-the-expert sessions, expert and peer customer collaboration through social collaboration, and much more.
With the support of the enterprise support advisory center and the enterprise support value map for cloud and hybrid initiatives, Carsa improved its ability to adopt an integration strategy and determine the integration tools that best fit its needs. This level of insight frees the company to dedicate 30% more time and resources to driving innovation and to make better use of its human capital management cloud solutions.
Mapping a clear path to competitive, value-add digital transformation
While digital transformation is a fundamental requirement to survive and grow, all-encompassing, business-wide change typically takes a long time to execute. Unfortunately, your market, stakeholders, customers, and competitors are not willing to wait.
With enterprise support value maps, you can accelerate the reinvention of your business through a broad range of services, tools, best practices, and expertise. Best of all, this level of insight empowers your workforce to resolve business challenges and drive competitive, value-add outcomes.
In a recent Q&A with SAP, “The Value of Data and Analytics in Digital Transformation,” Dan Vesset at IDC makes an interesting observation. Deep into the exchange, he points out that today, “there is significantly greater acceptance by IT that it shouldn’t control all things analytics.”
The reasons why are intriguing. IT control, of course, is a long-standing issue when it comes to enterprise technology in general – and models have swung from one extreme to the other. In the interest of security, some IT groups wield considerable control over who uses what technology. In the interest of flexibility, others are quite open – allowing almost anyone to access anything with minimal controls.
Ideally, you want to find the sweet spot that balances these two extremes so that your company can minimize risk on the one hand while enabling the flexibility to innovate and serve customers more effectively on the other. This, I think, is fairly obvious. But the dichotomy between security and flexibility is not really what Dan has in mind.
Complexity and control
What he has in mind is complexity. The fact is, many IT organizations exert control over enterprise technology due to its complexity. Let’s call it “control by necessity.” A classic case is analytics, which until recently has almost always been no more than a step away from the complexities of data management. Analytical tools were complex – because using them required a fair amount of data management expertise.
In the past, analytics involved data warehouses stored on disk where experts ran batch jobs on data subsets and generated reports that were then delivered to the business. Dashboards were a nice advance, but IT had to build them, and they quickly lost relevance. IT controlled analytics because they needed to.
The new face of analytics
Today, things have shifted. Today, businesses are using new technologies in new ways to bring analytics directly to the consumer. Dan runs through a few examples:
Advances in user interface design: Today’s state-of-the-art UIs take a cue from the mobile world, where data can be accessed and manipulated with touch-screen simplicity. Not only are interfaces increasingly designed with the user in mind; they’re designed by the user to meet individual needs and preferences.
In-memory databases: Rather than storing transactional data and analytical data in separate silos, companies can now store both kinds of data in active memory, where it is easier to work with. This is helping companies sense and respond to developments faster and more effectively.
Cloud computing: Modern analytics, Dan says, is architected for the cloud. New data platforms that run in the cloud – or at least deliver analytics for consumption via the cloud – are helping companies meet demand for data by business users with greater flexibility at lower costs.
Machine learning: Companies can now use intelligent algorithms to analyze process data, identify issues, and take action to improve processes – often without human intervention. This only makes things easier for consumers of analytics, who can now spend more time on higher-value tasks.
Bring analytics to the masses – but start small
Surely other technologies and trends have played a role in simplifying analytics for the business consumer. And we can expect more technologies to emerge over time. But whatever specific technologies a company chooses to adopt, Dan warns against big-bang digital transformation projects that are implemented enterprise-wide for ill-defined reasons.
Most companies are better off with targeted projects that fulfill defined needs or answer specific questions. Fortunately, today’s leading-edge cloud analytics platforms are designed for rapid expansion. The best approach is to get comfortable first, expand as needed, and then evaluate if cloud analytics is something that could help enterprise-wide.
For more on the issues covered in Dan’s interview, get the full text here. It’s worth a read.
Although many companies are still determining their overall cloud strategy, the question for many is not whether to implement a cloud solution, but when and how it will be carried out. Sid Nag, research director at Gartner, has cited numerous reasons for this uptick in cloud services, noting that many companies are now realizing the benefits of cloud solutions. These include greater agility and scalability, lower costs, and more opportunities for innovation—all of which are factors that fuel business growth and enable companies to keep pace within their industries.
Case in point: Intrigo Systems
One such company benefiting from the cloud is Intrigo Systems, a systems integrator and technology service provider that specializes in implementing extended supply chain solutions for its customers. Started in 2009, the company has undergone tremendous growth within the last eight years. This has resulted in a customer base of nearly 200 organizations, with revenues ranging from US$300 million to $60 billion, which are served by Intrigo’s network of more than 250 consultants worldwide.
While most of its operations are based in North America, Intrigo has been expanding its business. The company has gone from having offices in Fremont, Calif., Houston, and Dallas to establishing a presence worldwide. Today, Intrigo boasts offices not only in multiple United States cities, but also in Chennai, Bangalore, Frankfurt, and Heidelberg.
Designing for the future
Intrigo’s remarkable success and growth since its formation led the company to a crossroads. Like many in its position, it needed to start looking ahead and considering what tools would be required to create a foundation that could support its aggressive global scale-up.
“We realized the only way we could do this was to bring automation and the digital transformation to our enterprise,” said Kanth Krishnan, chief customer officer of Intrigo Systems. “Another important consideration for us was how we could accomplish this efficiently, while still providing our customers and network of consultants with top-notch service and resources.”
Intrigo began drilling down into its business processes, identifying the need to eliminate redundancies and enhance its resource management capabilities. By implementing an integrated business solution, the company has been able to better manage its assets. This has improved its ability to oversee its network of consultants and satisfy the various needs of its wide-ranging customer base.
Following this, the company turned to another trouble spot: travel and expenses. As the business grew, Intrigo’s consultants began to expand their reach, traveling worldwide to deliver its services and expertise to customers, many of which operate on an international scale. Intrigo knew it was critical to establish a more organized and scalable means of tracking these items. To accomplish this, it turned to an automated expense-reporting tool, which has helped the organization better manage these processes. While these solutions have helped address specific areas of the business, there was still the issue of how best to manage the company’s operations holistically.
“We felt as if we had hit a wall,” said Krishnan. “We’d grown to such a degree that we felt limited by many of our other systems, which couldn’t deliver the services we needed from a multinational standpoint. We thought about implementing local solutions in each country, but that didn’t really make sense to us. We realized we needed a more inclusive solution and couldn’t think of any company better suited to provide it than SAP.”
Intrigo wanted to create a single platform on which it could operate as a cohesive, global entity. It also wanted the ability to manage its resources and projects more efficiently. The company hoped that a more intuitive and comprehensive system would allow it to gain better visibility into its projects, from where money and resources were spent to how different initiatives translated into outcomes for customers.
Understanding how its projects were operating was just one aspect of the company’s journey. Intrigo knew it needed to put in place a system that could not only supply valuable data on how its money and employees were being allocated, but also provide insight that would help the company make proactive changes to improve its operations and lower expenses.
To accomplish these goals, Intrigo implemented a solution equipped with an advanced in-memory platform, which brings contextual analytics, digital assistance capabilities, machine learning, and a well-designed user interface to the public cloud. The solution enables companies to benefit from the latest and most innovative advances in ERP without sizeable up-front capital expenditures or extensive infrastructure changes.
The solution includes a personal assistant, which streamlines many of the tedious tasks associated with enterprise maintenance and offers insights and guidance to drive efficient collaboration and better business decisions.
A new, intelligent ERP core
In just eight weeks, the solution became the intelligent core of Intrigo’s ERP platform. The rapid, fit-to-standard implementation ensured that the company could smoothly transition its operations without any lapse in service or functions. It also integrated tightly with existing solutions to serve as a scalable, long-term platform capable of growing and changing with the company, helping it remain responsive and competitive within its market.
As a result, Intrigo has gained greater visibility into its operations, enhanced its resource management capabilities, reduced revenue leakage, built robust controls and compliance processes, and instituted an intuitive, unified platform to better oversee global business operations from end to end. This is bringing increased transparency and efficiency to every aspect of the business, from billing and timesheets to customer engagement and employee satisfaction.
Dan McCaffrey has an ambitious goal: solving the world’s looming food shortage.
As vice president of data and analytics at The Climate Corporation (Climate), which is a subsidiary of Monsanto, McCaffrey leads a team of data scientists and engineers who are building an information platform that collects massive amounts of agricultural data and applies machine-learning techniques to discover new patterns. These analyses are then used to help farmers optimize their planting.
“By 2050, the world is going to have too many people at the current rate of growth. And with shrinking amounts of farmland, we must find more efficient ways to feed them. So science is needed to help solve these things,” McCaffrey explains. “That’s what excites me.”
“The deeper we can go into providing recommendations on farming practices, the more value we can offer the farmer,” McCaffrey adds.
But to deliver that insight, Climate needs data—and lots of it. That means using remote sensing and other techniques to map every field in the United States and then combining that information with climate data, soil observations, and weather data. Climate’s analysts can then produce a massive data store that they can query for insights.
Meanwhile, precision tractors stream data into Climate’s digital agriculture platform, which farmers can then access from iPads through easy data flow and visualizations. They gain insights that help them optimize their seeding rates, soil health, and fertility applications. The overall goal is to increase crop yields, which in turn boosts a farmer’s margins.
Climate is at the forefront of a push toward deriving valuable business insight from Big Data that isn’t just big, but vast. Companies of all types—from agriculture through transportation and financial services to retail—are tapping into massive repositories of data known as data lakes. They hope to discover correlations that they can exploit to expand product offerings, enhance efficiency, drive profitability, and discover new business models they never knew existed.
The internet democratized access to data and information for billions of people around the world. Ironically, however, access to data within businesses has traditionally been limited to a chosen few—until now. Today’s advances in memory, storage, and data tools make it possible for companies both large and small to cost effectively gather and retain a huge amount of data, both structured (such as data in fields in a spreadsheet or database) and unstructured (such as e-mails or social media posts). They can then allow anyone in the business to access this massive data lake and rapidly gather insights.
It’s not that companies couldn’t do this before; they just couldn’t do it cost effectively and without a lengthy development effort by the IT department. With today’s massive data stores, line-of-business executives can generate queries themselves and quickly churn out results—and they are increasingly doing so in real time. Data lakes have democratized both the access to data and its role in business strategy.
Indeed, data lakes move data from being a tactical tool for implementing a business strategy to being a foundation for developing that strategy through a scientific-style model of experimental thinking, queries, and correlations. In the past, companies’ curiosity was limited by the expense of storing data for the long term. Now companies can keep data for as long as it’s needed. And that means companies can continue to ask important questions as they arise, enabling them to future-proof their strategies.
Climate’s McCaffrey has many questions to answer on behalf of farmers. Climate provides several types of analytics to farmers including descriptive services, which are metrics about the farm and its operations, and predictive services related to weather and soil fertility. But eventually the company hopes to provide prescriptive services, helping farmers address all the many decisions they make each year to achieve the best outcome at the end of the season. Data lakes will provide the answers that enable Climate to follow through on its strategy.
Behind the scenes at Climate is a deep-science data lake that provides insights, such as predicting the fertility of a plot of land by combining many data sets to create accurate models. These models allow Climate to give farmers customized recommendations based on how their farm is performing.
“Machine learning really starts to work when you have the breadth of data sets from tillage to soil to weather, planting, harvest, and pesticide spray,” McCaffrey says. “The more data sets we can bring in, the better machine learning works.”
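The pattern McCaffrey describes, where each additional data set makes the model better, can be sketched with a toy regression: every new source (soil, weather, planting) becomes another feature column. Everything below — the field counts, coefficients, and units — is invented for illustration and is not Climate’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_fields = 200

# Hypothetical per-field data sets (stand-ins for Climate's real sources)
soil_ph = rng.uniform(5.5, 7.5, n_fields)
rainfall_mm = rng.uniform(300, 900, n_fields)
seeding_rate = rng.uniform(25_000, 40_000, n_fields)

# Synthetic yield: a linear mix of the inputs plus noise
yield_bu = (10 * soil_ph + 0.05 * rainfall_mm
            + 0.001 * seeding_rate + rng.normal(0, 2, n_fields))

# Joining the data sets into one feature matrix mirrors the
# "breadth of data sets" idea: each new source adds a column.
X = np.column_stack([soil_ph, rainfall_mm, seeding_rate,
                     np.ones(n_fields)])          # bias term
coef, *_ = np.linalg.lstsq(X, yield_bu, rcond=None)

# Recommend for a new field by predicting yield from its features
new_field = np.array([6.8, 650.0, 32_000.0, 1.0])
predicted = new_field @ coef
print(f"Predicted yield: {predicted:.1f} bushels/acre")
```

A real system would use far richer models, but the mechanics of "more data sets, more columns, better fit" are the same.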
The deep-science infrastructure already has terabytes of data but is poised for significant growth as it handles a flood of measurements from field-based sensors.
“That’s really scaling up now, and that’s what’s also giving us an advantage in our ability to really personalize our advice to farmers at a deeper level because of the information we’re getting from sensor data,” McCaffrey says. “As we roll that out, our scale is going to increase by several magnitudes.”
Also on the horizon is more real-time data analytics. Currently, Climate receives real-time data from its application that streams data from the tractor’s cab, but most of its analytics applications are run nightly or even seasonally.
In August 2016, Climate expanded its platform to third-party developers so other innovators can also contribute data, such as drone-captured data or imagery, to the deep-science lake.
“That helps us in a lot of ways, in that we can get more data to help the grower,” McCaffrey says. “It’s the machine learning that allows us to find the insights in all of the data. Machine learning allows us to take mathematical shortcuts as long as you’ve got enough data and enough breadth of data.”
Growth is essential for U.S. railroads, which reinvest a significant portion of their revenues in maintenance and improvements to their track systems, locomotives, rail cars, terminals, and technology. With an eye on growing its business while also keeping its costs down, CSX, a transportation company based in Jacksonville, Florida, is adopting a strategy to make its freight trains more reliable.
In the past, CSX maintained its fleet of locomotives through regularly scheduled maintenance activities, which prevent failures in most locomotives as they transport freight from shipper to receiver. To achieve even higher reliability, CSX is tapping into a data lake to power predictive analytics applications that will improve maintenance activities and prevent more failures from occurring.
Beyond improving customer satisfaction and raising revenue, CSX’s new strategy also has major cost implications. Trains are expensive assets, and it’s critical for railroads to drive up utilization, limit unplanned downtime, and prevent catastrophic failures to keep the costs of those assets down.
That’s why CSX is putting all the data related to the performance and maintenance of its locomotives into a massive data store.
“We are then applying predictive analytics—or, more specifically, machine-learning algorithms—on top of that information that we are collecting to look for failure signatures that can be used to predict failures and prescribe maintenance activities,” says Michael Hendrix, technical director for analytics at CSX. “We’re really looking to better manage our fleet and the maintenance activities that go into that so we can run a more efficient network and utilize our assets more effectively.”
“In the past we would have to buy a special storage device to store large quantities of data, and we’d have to determine cost benefits to see if it was worth it,” says Donna Crutchfield, assistant vice president of information architecture and strategy at CSX. “So we were either letting the data die naturally, or we were only storing the data that was determined to be the most important at the time. But today, with the new technologies like data lakes, we’re able to store and utilize more of this data.”
CSX can now combine many different data types, such as sensor data from across the rail network and other systems that measure movement of its cars, and it can look for correlations across information that wasn’t previously analyzed together.
One of the larger data sets that CSX is capturing comprises the findings of its “wheel health detectors” across the network. These devices capture different signals about the bearings in the wheels, as well as the health of the wheels in terms of impact, sound, and heat.
“That volume of data is pretty significant, and what we would typically do is just look for signals that told us whether the wheel was bad and if we needed to set the car aside for repair. We would only keep the raw data for 10 days because of the volume and then purge everything but the alerts,” Hendrix says.
With its data lake, CSX can keep the wheel data for as long as it likes. “Now we’re starting to capture that data on a daily basis so we can start applying more machine-learning algorithms and predictive models across a larger history,” Hendrix says. “By having the full data set, we can better look for trends and patterns that will tell us if something is going to fail.”
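CSX’s models are proprietary, but the core idea Hendrix describes — keep the full history, fit a trend, and extrapolate to schedule maintenance before the failure point — can be illustrated with a minimal sketch. The threshold, units, and readings below are hypothetical.

```python
from statistics import mean

def trend_slope(readings):
    """Least-squares slope of a series of daily sensor readings."""
    n = len(readings)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(readings)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, readings))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

FAIL_THRESHOLD = 90.0   # hypothetical impact level that forces a set-out

def days_until_failure(readings):
    """Extrapolate the trend to estimate when the threshold is crossed."""
    slope = trend_slope(readings)
    if slope <= 0:
        return None                      # no deteriorating trend
    return (FAIL_THRESHOLD - readings[-1]) / slope

# 30 days of wheel-impact history for one wheel, drifting up ~0.8/day
history = [60 + 0.8 * d for d in range(30)]
print(days_until_failure(history))       # roughly 8.5 days of margin left
```

With only 10 days of raw data, as in the old purge-after-alerts regime, a slow drift like this would be much harder to detect — which is the point of keeping the full history.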
Another key ingredient in CSX’s data set is locomotive oil. By analyzing oil samples, CSX is developing better predictions of locomotive failure. “We’ve been able to determine when a locomotive would fail and predict it far enough in advance so we could send it down for maintenance and prevent it from failing while in use,” Crutchfield says.
“Between the locomotives, the tracks, and the freight cars, we will be looking at various ways to predict those failures and prevent them so we can improve our asset allocation. Then we won’t need as many assets,” she explains. “It’s like an airport. If a plane has a failure and it’s due to connect at another airport, all the passengers have to be reassigned. A failure affects the system like dominoes. It’s a similar case with a railroad. Any failure along the road affects our operations. Fewer failures mean more asset utilization. The more optimized the network is, the better we can service the customer.”
Detecting Fraud Through Correlations
Traditionally, business strategy has been a very conscious practice, presumed to emanate mainly from the minds of experienced executives, daring entrepreneurs, or high-priced consultants. But data lakes take strategy out of that rarefied realm and put it in the environment where just about everything in business seems to be going these days: math—specifically, the correlations that emerge from applying a mathematical algorithm to huge masses of data.
The Financial Industry Regulatory Authority (FINRA), a nonprofit group that regulates broker behavior in the United States, used to rely on the experience of its employees to come up with strategies for combating fraud and insider trading. It still does that, but now FINRA has added a data lake to find patterns that a human might never see.
Overall, FINRA processes over five petabytes of transaction data from multiple sources every day. By switching from traditional database and storage technology to a data lake, FINRA was able to set up a self-service process that allows analysts to query data themselves without involving the IT department; search times dropped from several hours to 90 seconds.
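FINRA’s actual stack isn’t described here, so the sketch below uses Python’s built-in sqlite3 as a stand-in for a data-lake SQL engine; the table, columns, and volume threshold are invented. What it shows is the workflow change: the analyst writes and runs the query directly, with no IT ticket in the loop.

```python
import sqlite3

# sqlite3 stands in for a data-lake query engine; the schema is invented.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE trades (
    broker TEXT, symbol TEXT, qty INTEGER, ts TEXT)""")
con.executemany(
    "INSERT INTO trades VALUES (?, ?, ?, ?)",
    [("A", "XYZ", 500, "2016-03-01T09:58"),
     ("A", "XYZ", 700, "2016-03-01T09:59"),
     ("B", "XYZ", 10,  "2016-03-01T10:05"),
     ("A", "ABC", 20,  "2016-03-02T11:00")])

# Self-service: an ad hoc question ("which brokers traded unusually
# large volume in one symbol on one day?") becomes a direct query.
rows = con.execute("""
    SELECT broker, symbol, date(ts) AS day, SUM(qty) AS total
    FROM trades
    GROUP BY broker, symbol, day
    HAVING SUM(qty) > 1000
""").fetchall()
print(rows)   # [('A', 'XYZ', '2016-03-01', 1200)]
```

At FINRA’s petabyte scale the engine differs, but the self-service shape of the interaction is the same.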
While traditional databases were good at defining relationships with data, such as tracking all the transactions from a particular customer, the new data lake configurations help users identify relationships that they didn’t know existed.
Leveraging its data lake, FINRA creates an environment for curiosity, empowering its data experts to search for suspicious patterns of fraud, market manipulation, and compliance violations. As a result, FINRA handed out 373 fines totaling US$134.4 million in 2016, a new record for the agency, according to Law360.
Data Lakes Don’t End Complexity for IT
Though data lakes make access to data and analysis easier for the business, they don’t necessarily make the CIO’s life a bed of roses. Implementations can be complex, and companies rarely want to walk away from investments they’ve already made in data analysis technologies, such as data warehouses.
“There have been so many millions of dollars going to data warehousing over the last two decades. The idea that you’re just going to move it all into a data lake isn’t going to happen,” says Mike Ferguson, managing director of Intelligent Business Strategies, a UK analyst firm. “It’s just not compelling enough of a business case.” But Ferguson does see data lake efficiencies freeing up the capacity of data warehouses to enable more query, reporting, and analysis.
How to Avoid Drowning in the Lake
The benefits of data lakes can be squandered if you don’t manage the implementation and data ownership carefully.
Deploying and managing a massive data store is a big challenge. Here’s how to address some of the most common issues that companies face:
Determine the ROI. Developing a data lake is not a trivial undertaking. You need a good business case, and you need a measurable ROI. Most importantly, you need initial questions that can be answered by the data, which will prove its value.
Find data owners. As devices with sensors proliferate across the organization, the issue of data ownership becomes more important.
Have a plan for data retention. Companies used to have to cull data because it was too expensive to store. Now companies can become data hoarders. How long do you store it? Do you keep it forever?
Manage descriptive data. Software that allows you to tag all the data in one or multiple data lakes and keep it up-to-date is not mature yet. We still need tools to bring the metadata together to support self-service and to automate metadata to speed up the preparation, integration, and analysis of data.
Develop data curation skills. There is a huge skills gap for data repository development. But many people will jump at the chance to learn these new skills if companies are willing to pay for training and certification.
Be agile enough to take advantage of the findings. It used to be that you put in a request to the IT department for data and had to wait six months for an answer. Now, you get the answer immediately. Companies must be agile to take advantage of the insights.
Secure the data. Besides the perennial issues of hacking and breaches, much data lake software is open source and less secure than typical enterprise-class software.
Measure the quality of data. Different users can work with varying levels of quality in their data. For example, data scientists working with a huge number of data points might not need completely accurate data, because they can use machine learning to cluster data or discard outlying data as needed. However, a financial analyst might need the data to be completely correct.
Avoid creating new silos. Data lakes should work with existing data architectures, such as data warehouses and data marts.
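The quality point above can be made concrete. A data scientist exploring a noisy data lake might apply a simple robust filter before modeling, as in the sketch below (with invented sensor readings); a financial analyst, by contrast, would need every record verified and corrected, not silently discarded.

```python
from statistics import median

def discard_outliers(values, k=5.0):
    """Drop points more than k median-absolute-deviations from the median.
    Good enough for exploratory analysis; unacceptable where every
    record must be exactly right (e.g. financial reporting)."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    return [v for v in values if abs(v - med) <= k * mad]

readings = [10.1, 9.8, 10.3, 9.9, 10.0, 250.0]   # one bad sensor reading
print(discard_outliers(readings))   # → [10.1, 9.8, 10.3, 9.9, 10.0]
```

The median-based filter is used here (rather than a mean/standard-deviation cutoff) because a single extreme value inflates the standard deviation enough to mask itself.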
From Data Queries to New Business Models
The ability of data lakes to uncover previously hidden data correlations can massively impact any part of the business. For example, in the past, a large soft drink maker used to stock its vending machines based on local bottlers’ and delivery people’s experience and gut instincts. Today, using vast amounts of data collected from sensors in the vending machines, the company can essentially treat each machine like a retail store, optimizing the drink selection by time of day, location, and other factors. Doing this kind of predictive analysis was possible before data lakes came along, but it wasn’t practical or economical at the individual machine level because the amount of data required for accurate predictions was simply too large.
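Treating “each machine like a retail store” boils down to aggregating telemetry per machine and daypart and stocking accordingly. A toy version, with invented machine IDs and sales events:

```python
from collections import Counter, defaultdict

# Hypothetical sensor events: (machine_id, hour_of_day, product)
sales = [
    ("VM-17", 8,  "espresso"), ("VM-17", 8,  "espresso"),
    ("VM-17", 8,  "water"),    ("VM-17", 14, "cola"),
    ("VM-17", 14, "cola"),     ("VM-17", 14, "iced tea"),
    ("VM-42", 8,  "water"),    ("VM-42", 14, "water"),
]

# Tally demand separately for each (machine, hour) "store"
demand = defaultdict(Counter)
for machine, hour, product in sales:
    demand[(machine, hour)][product] += 1

# Stock each machine/daypart with its top sellers
for slot, counts in sorted(demand.items()):
    top = [p for p, _ in counts.most_common(2)]
    print(slot, "->", top)
```

The economics the article describes come from doing exactly this, but across millions of events per machine — a volume that only became practical to retain once cheap, large-scale storage arrived.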
The next step is for companies to use the insights gathered from their massive data stores not just to become more efficient and profitable in their existing lines of business but also to actually change their business models.
For example, product companies could shield themselves from the harsh light of comparison shopping by offering the use of their products as a service, with sensors on those products sending the company a constant stream of data about when they need to be repaired or replaced. Customers are spared the hassle of dealing with worn-out products, and companies are protected from competition as long as customers receive the features, price, and the level of service they expect. Further, companies can continuously gather and analyze data about customers’ usage patterns and equipment performance to find ways to lower costs and develop new services.
Data for All
Given the tremendous amount of hype that has surrounded Big Data for years now, it’s tempting to dismiss data lakes as a small step forward in an already familiar technology realm. But it’s not the technology that matters as much as what it enables organizations to do. By making data available to anyone who needs it, for as long as they need it, data lakes are a powerful lever for innovation and disruption across industries.
“Companies that do not actively invest in data lakes will truly be left behind,” says Anita Raj, principal growth hacker at DataRPM, which sells predictive maintenance applications to manufacturers that want to take advantage of these massive data stores. “So it’s just the option of disrupt or be disrupted.”
The Digitalist Magazine is your online destination for everything you need to know to lead your enterprise’s digital transformation.
Read the Digitalist Magazine and get the latest insights about the digital economy that you can capitalize on today.
About Timo Elliott
Timo Elliott is an Innovation Evangelist for SAP and a passionate advocate of innovation, digital business, analytics, and artificial intelligence. He was the eighth employee of BusinessObjects and for the last 25 years he has worked closely with SAP customers around the world on new technology directions and their impact on real-world organizations. His articles have appeared in publications such as Harvard Business Review, Forbes, ZDNet, The Guardian, and Digitalist Magazine. He has worked in the UK, Hong Kong, New Zealand, and Silicon Valley, and currently lives in Paris, France. He has a degree in Econometrics and a patent in mobile analytics.
Oil prices have fallen dramatically over the last few years, forcing some major oil companies to take drastic actions, including layoffs and cuts to investments and budgets. Shell, for example, shelved its plan to invest in Qatar, Aramco put its deep-water exploration in the Red Sea on hold, Schlumberger laid off a few thousand employees, and the list goes on…
In view of falling oil prices and the resulting squeeze on cash flows, the oil and gas industry has been challenged to adapt and optimize its performance to remain profitable while maintaining a long-term investment and operating outlook. Currently, oil and gas companies find it difficult to maintain the same level of investment in exploration and production as when crude prices were at their peak. Operating in the oil and gas industry today means balancing a dizzying array of trade-offs in the drive for competitive advantage while maximizing return on investment.
The result is a dire need to optimize performance and reduce the cost of production per barrel. Companies have many optimization opportunities once they start using the massive amounts of data generated by oil fields. Oil and gas companies can turn this crisis into an opportunity by leveraging technological innovations like artificial intelligence to build a foundation for long-term success. If volatility in oil prices is the new norm, the push for “value over volume” is the key to success going forward.
Using AI tools, upstream oil and gas companies can shift their approach from production at all costs to producing in context. They will need to do profit and loss management at the well level to optimize the production cost per barrel. To do this, they must integrate all aspects of production management, collect the data for analysis and forecasting, and leverage artificial intelligence to optimize operations.
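As a rough illustration of well-level profit and loss management, the sketch below computes a per-barrel cost and margin for hypothetical wells. All figures, field names, and the assumed oil price are invented; a real system would pull these from integrated production and finance data.

```python
# Hypothetical per-well monthly figures; names and numbers are illustrative.
wells = [
    {"id": "W-1", "barrels": 9000, "lift": 117000,
     "maintenance": 30000, "logistics": 18000},
    {"id": "W-2", "barrels": 2500, "lift": 80000,
     "maintenance": 45000, "logistics": 12000},
]

ASSUMED_PRICE = 48.0  # assumed realized price, $/bbl (illustrative)

def well_pnl(well, price=ASSUMED_PRICE):
    """Compute cost and margin per barrel for a single well."""
    total_cost = well["lift"] + well["maintenance"] + well["logistics"]
    cost_per_bbl = total_cost / well["barrels"]
    return {"id": well["id"],
            "cost_per_bbl": round(cost_per_bbl, 2),
            "margin_per_bbl": round(price - cost_per_bbl, 2)}

for well in wells:
    print(well_pnl(well))
```

A negative margin flags a well that produces at a loss at the assumed price, which is the kind of "producing in context" decision, rather than production at all costs, that the text describes.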
When remote sensors are connected to wireless networks, data can be collected and centrally analyzed from any location. According to the consulting firm McKinsey, the oil and gas supply chain stands to gain $50 billion in savings and increased profit by adopting AI. As an example, using AI algorithms to more accurately sift through signals and noise in seismic data can decrease dry wellhead development by 10 percent.
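The idea of sifting signal from noise can be illustrated with a toy smoothing filter. A simple moving average is far cruder than production seismic processing, but it shows the basic principle of suppressing high-frequency noise in a trace; the sample values are invented.

```python
def moving_average(trace, window=3):
    """Smooth a 1-D trace with a centered moving average (edges shrink)."""
    half = window // 2
    smoothed = []
    for i in range(len(trace)):
        lo, hi = max(0, i - half), min(len(trace), i + half + 1)
        smoothed.append(sum(trace[lo:hi]) / (hi - lo))
    return smoothed

# A spiky, noise-laden trace; isolated spikes are damped after smoothing.
noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 5.0, 0.0, 1.0, 0.0]
print(moving_average(noisy))
```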
How oil and gas can leverage artificial intelligence
1. Planning and forecasting
On a macro scale, deep machine learning can help increase awareness of macroeconomic trends to drive investment decisions in exploration and production. Economic conditions and even weather patterns can be considered to determine where investments should take place as well as the intensity of production.
2. Eliminate costly risks in drilling
Drilling is an expensive and risky investment, and applying AI in the operational planning and execution stages can significantly improve well planning, real-time drilling optimization, frictional drag estimation, and well cleaning predictions. AI can also support geoscientists in areas such as rate of penetration (ROP) improvement, well integrity assessment, operational troubleshooting, drilling equipment condition recognition, real-time drilling risk recognition, and operational decision-making.
When drilling, machine-learning software takes into consideration a plethora of factors, such as seismic vibrations, thermal gradients, and strata permeability, along with more traditional data such as pressure differentials. AI can help optimize drilling operations by driving decisions such as direction and speed in real time, and it can predict failure of equipment such as electric submersible pumps (ESPs) to reduce unplanned downtime and equipment costs.
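As a simplified sketch of the kind of pump failure prediction described above, the following flags sensor readings that deviate sharply from a recent rolling baseline. The vibration values, window size, and threshold are invented; real predictive models combine many sensor channels and learned failure signatures.

```python
import statistics

# Hypothetical hourly vibration readings (mm/s) from a pump sensor.
readings = [2.1, 2.0, 2.2, 2.1, 2.3, 2.2, 2.1, 4.8, 5.1, 5.4]

def flag_anomalies(series, window=5, z_threshold=3.0):
    """Flag readings far outside the rolling baseline (possible early failure)."""
    flags = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1e-9  # guard against flat baselines
        flags.append((i, series[i], (series[i] - mean) / stdev > z_threshold))
    return flags

print(flag_anomalies(readings))
```

Flagging the first sharp deviation gives maintenance crews lead time before the pump actually fails, which is where the downtime savings come from.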
3. Well, reservoir, and facility management
Well, reservoir, and facility management involves integrating multiple disciplines: reservoir engineering, geology, production technology, petrophysics, operations, and seismic interpretation. AI can help create tools that allow asset teams to build a shared professional understanding and identify opportunities to improve operational performance.
AI techniques can also be applied to other activities, such as reservoir characterization, modeling, and field surveillance. Fuzzy logic, artificial neural networks, and expert systems are used extensively across the industry to accurately characterize reservoirs in order to attain optimum production levels.
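Fuzzy logic in reservoir characterization amounts to replacing hard cutoffs with membership degrees. The sketch below is a minimal illustration only: the porosity and permeability thresholds are invented, and the `min` operator is the classic fuzzy AND.

```python
def ramp(x, low, high):
    """Fuzzy membership: 0 below `low`, 1 above `high`, linear in between."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def reservoir_quality(porosity, permeability_md):
    """Fuzzy AND (min) of two membership scores; thresholds are illustrative."""
    return min(ramp(porosity, 0.05, 0.25), ramp(permeability_md, 1.0, 100.0))

print(reservoir_quality(0.18, 40.0))
```

Unlike a hard cutoff, a borderline reservoir gets a score between 0 and 1, so candidates can be ranked instead of being discarded outright.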
Today, AI systems form the backbone of digital oil field (DOF) concepts and implementations. However, there is still great potential for new ways to optimize field development and production costs, prolong field life, and increase the recovery factor.
4. Predictive maintenance
Today, artificial intelligence is taking the industry by storm. AI-powered software and sensor hardware enable companies to use very large amounts of data to generate real-time recommendations on the best course of action. With predictive analytics and cognitive security, for example, oil and gas companies can operate equipment safely and securely while receiving recommendations on how to avoid future equipment failures or remediate potential security breaches.
5. Oil and gas well surveying and inspections
Drones have been part of the oil and gas industry since 2013, when ConocoPhillips used the Boeing ScanEagle drone in trials in the Chukchi Sea. In June 2014, the Federal Aviation Administration (FAA) issued the first commercial permit for drone use over United States soil to BP, allowing the company to survey pipelines, roads, and equipment in Prudhoe Bay, Alaska. In January, Sky-Futures completed the first drone inspection in the Gulf of Mexico.
While drones are primarily used in the midstream sector, they can be applied to almost every aspect of the industry, including land surveying and mapping, well and pipeline inspections, and security. Technology is also being developed to enable drones to detect methane leaks early. In addition, drones could one day be used to find oil and gas reservoirs underlying remote, uninhabited regions, all from the comfort of a warm office.
6. Remote logistics
Because logistics to offshore locations is a constant challenge, AI-enhanced drones can be used to deliver materials to remote offshore sites.
Current adoption of AI
Chevron is currently using AI to identify new well locations and stimulation candidates in California. By using AI software to analyze the company’s large collection of historical well performance data, the company is drilling in better locations and has seen production rise 30% over conventional methods. Chevron is also using predictive models to analyze the performance of thousands of pieces of rotating equipment to detect failures before they occur. By addressing problems before they become critical, Chevron has avoided unplanned shutdowns and lowered repair expenses. Increased production and lower costs have translated to more profit per well.
Today’s oil and gas industry has been transformed by two industry downturns in one decade. Although the adoption of new hard technology such as directional drilling and hydraulic fracturing (fracking) has helped, the oil and gas industry needs to continue to innovate in today’s low-price market to survive. AI has the potential to separate the companies that thrive from those that are left behind.
The promise of AI is already being realized in the oil and gas industry. Early adopters are taking advantage of their position to get a head start on the competition and protect their assets. The industry has always leveraged technology to adapt to change, and early adopters have always benefited the most. As competition in the oil and gas industry continues to heat up, companies cannot afford to be left behind. For those that understand and seize the opportunities inherent in adopting cognitive technologies, the future looks bright.
About Anoop Srivastava
Anoop Srivastava is Senior Director of the Energy and Natural Resources Industries at SAP Value Engineering in the Middle East and North Africa. He advises clients on their digital transformation strategies and helps them align their business strategy with IT strategy by leveraging digital technology innovations such as the Internet of Things, Big Data, advanced analytics, and cloud computing. He has more than 21 years of experience spanning the oil and gas industry, business consulting, industry value advisory, and digital transformation.