
IT Trends That Matter For 2016

Hu Yoshida

Each year, analysts predict some of the upcoming trends in the technology industry. Here is a look at some of the IT trends that matter for 2016, according to Hu Yoshida, chief technology officer at Hitachi Data Systems (HDS).

A greater focus on application and analytics

1. IT skills undergo transformation

To meet the challenges of IT transformation, IT must offload the grunt work that ties its staff to infrastructure management and operations and start to develop specialist skills in areas such as cloud enablement, analytics, DevOps, mobile, and business solutions. This transformation of IT skills will involve a change in culture and will require the commitment of both business and IT leaders.

2. DevOps adoption accelerates application delivery

DevOps is a software development methodology where operations and development engineers work together throughout the application cycle, resulting in high IT performance. Companies with high IT performance are twice as likely to exceed their profitability, market share, and productivity goals.

3. Data warehouses transition into data lakes

Big Data analytics involves the processing of large amounts of heterogeneous data derived from multiple sources and across multiple knowledge domains. Data lakes enable this by bringing together data sources in their original state, which can then be analyzed by applications that are brought to the data. Data lakes must also be able to incorporate existing data warehouses to leverage the investments that have already been made.

4. IT takes control of provisioning analytics platforms

Business leaders will look to IT to make investments in analytics platforms, acknowledging the fact that IT has a better understanding of security, data privacy, integration, and the service level requirements of the business. This will reverse the shadow IT trend of business units acquiring their own analytics platforms and tools and creating their own data silos.

Infrastructure technologies drive efficiencies

5. Converged solutions replace reference architectures

Instead of providing reference architectures detailing best practices for application enablement, vendors will begin to deliver these best practices as templates implemented through converged solutions. The converged infrastructure offers a more evolved platform for deriving greater cost efficiencies and time savings by allowing IT resources to be managed more cohesively.

6. In-memory databases gain traction

The move to in-memory databases will gather momentum as faster reporting and analysis deliver a clear competitive advantage in today’s real-time business environment. Developments such as the consolidation of SAP’s business suite onto the HANA in-memory database with S/4 HANA, and the emergence of converged solutions and cloud service providers, will help simplify IT and facilitate this migration.

7. Flash devices begin to replace high-performance disks

The availability of multi-terabyte flash devices will enable flash to compete with high-performance 15K RPM disk drives on a capacity-cost basis. As a result, the majority of storage systems delivered in 2016 will contain a percentage of flash to boost response times and reduce the cost of managing storage performance.

IT leadership drives innovation

8. Businesses prepare for next-gen cloud

According to a study by The Economist, some of the best practices that will help business leaders make the most of their cloud opportunities include improving supplier selection; choosing the right cloud service for the right task; making better use of integrators to connect cloud services to existing IT infrastructure; and considering factors such as cloud’s potential to improve business operations and boost employee efficiency.

9. IT infrastructure companies will be disrupted

As IT begins to focus more on application delivery, analytics, and the Internet of Things, pure-play infrastructure companies will try to cope with declining revenues by splitting off some parts of their business, acquiring new infrastructure companies, or merging with other infrastructure companies to drive economies of scale. However, in the longer term, they will have to be able to integrate IT with operational technology to deliver solutions around the Internet of Things that matter, in areas such as public safety, transportation, health, and life sciences.

10. IT plays leadership role in the 3rd Platform

IT will play a more proactive role in leading businesses through the transformation driven by social, mobile, analytics, and cloud, collectively known as the 3rd Platform. Contrary to the view that IT no longer plays a dominant role in driving enterprise technology spending, we believe that the compelling value of IT lies in its ability to implement 3rd Platform technologies in accordance with corporate requirements for security, data protection, availability, and collaboration. If IT does not step up to this leadership role, the result will be silos of information and duplication of processes that will inhibit business growth.

Please view the webinar discussing the top 10 IT trends that I see for 2016. It features insights from Greg Knieriemen, our technical evangelist, and Adrian Deluca, our Asia Pacific CTO, who add their own perspectives on these trends. I would also like to hear your views. As you will see, I am expecting a major transformation to happen in IT and in the vendor community.

For a more in-depth look at the multiple factors driving digital transformation, download the SAP eBook, Digital Disruption: How Digital Technology is Transforming Our World.

The article originally appeared on Hitachi Data Systems Community and is republished with the author’s permission.


About Hu Yoshida

Hu Yoshida is responsible for defining the technical direction of Hitachi Data Systems. Currently, he leads the company's effort to help customers address data life cycle requirements and resolve compliance, governance, and operational risk issues. He was instrumental in evangelizing the unique Hitachi approach to storage virtualization, which leveraged existing storage services within the Hitachi Universal Storage Platform® and extended them to externally attached, heterogeneous storage systems. Yoshida is well known within the storage industry, and his blog has ranked among the "top 10 most influential" in the storage industry as evaluated by Network World. In October 2006, Byte and Switch named him one of Storage Networking's Heaviest Hitters, and in 2013 he was named one of the "Ten Most Impactful Tech Leaders" by InformationWeek.

Smart Machines Create Markets For Cyber-Physical Advances

Marion Heidenreich

Today, industrial machines are more intelligent than ever before. These intelligent machines are changing companies in many ways.

Why smart machines?

Mobile networked computers were a key breakthrough for making smart machines. Big Data allows machines and computers to store information and analyze complex patterns. Cloud computing offers broad access to information and more storage.

These computerized machines are both physical and virtual. Some call them “cyber-physical” machines. Technology lets them be self-aware and connected to each other and larger systems.

Businesses change their approaches

Intelligent machines allow companies to innovate in many areas. For one, the value proposition for customers is evolving. Businesses now model and plan in different ways in many industries.

Makers of industrial machines and parts work in new ways within the organization. Engineering now partners with mechanical, electronic, and software staff to develop new products. Manufacturing now seamlessly ties what happens on the shop floor to the customer.

Service models are changing too. Scheduled and reactive servicing of machines is fading. Now intelligent machines track themselves. Machines detect problems and report them automatically. Major problems or failures are predicted and reported.

A data mining example

One good industrial example is mining, which can be dangerous and difficult. As ores become scarce, the costs of mining have increased.

“Smart machines” started in mining in the late 1990s. Software and hardware let remote users change settings. Operators moved hydraulic levers from a safe distance. Sensors observed performance and diagnosed issues.

Data cables connected machines to computers on the surface. Continuous and remote monitoring of the machines grew. Over time, embedded sensors helped improve monitoring, diagnostics, and data storage.

The technology means workers only go underground to fix specific issues. As a result, accident and injury risk is lower.

New wireless technology now lets mining companies connect data from many mine sites. Service centers access large amounts of data and can improve performance. Maintenance is prioritized and equipment downtime is reduced.

Opportunity abounds

For companies, the time is now. Today, mobile "connected things" generate 17% of the digital universe. By 2020 that share grows to 27%.

You might not be investing in this so-called “Internet of Things” (devices that connect to each other). But it’s a good bet your competitors are. A December 2015 study reported 33% of industrial companies are investing in the Internet of Things. Another 25% are considering it.

There are risks

This new dawning era of manufacturing is exciting. But there are concerns. Cyber attacks on the Internet of Things are not new. But as the use of intelligent machines grows, the threat of cyber attacks in industry grows.

Data confidentiality and privacy are concerns. So too are software and hardware vulnerabilities. Exposure to attack lies not just in the virtual space but the physical too. Tampering with unattended machines and theft pose serious risk.

To address these threats, industries must invest in cybersecurity along with smart machines.

Conclusion

The potential advantages of smart machines are staggering. They can reshape industries and change how companies produce new products and create new markets.

For more information, please download the white paper Digital Manufacturing: Powering the Fourth Industrial Revolution.


About Marion Heidenreich

Marion Heidenreich is a solution manager for the SAP Industrial Machinery and Components Business Unit who focuses on solution innovations like Product Costing on SAP HANA and cloud solutions, as well as providing financial and business analysis for industry business strategy definition and business planning.

Mining Firms Turn To Tech

Ruediger Schroedter

Gone are the days in mining when assessments of potential dig sites meant lots of waiting for results. Gone, too, is the uncertainty on a mine job about where to go next.

For mining executives, recent advances in digital technology allow companies to make decisions at a rapid pace. Decisions that used to take days or weeks can now be made in minutes or hours.

With more information available faster, mining leaders reduce both short- and long-term financial risk. Data from across the enterprise inform decisions about buying and selling assets. Profitability should increase, driven by key technology advances.

Digging in to the data

There are two key drivers to this digital revolution. The first is the rise of the Internet of Things (IoT). The IoT consists of devices that are equipped with sensors, software, and wireless capabilities. These devices are connected to each other and can detect, store, and send data.


The second is the rise of Big Data, mobile, and cloud computing. Today’s mobile devices can track, send, and receive data from remote sites worldwide. Cloud computing stores billions of bytes of data at low cost. Big Data analytics programs take data coming from many different locations and systems and synthesize it. Those programs then better inform decisions by offering dashboards, metrics, and predictive modeling.

Robots are able to venture into hazardous areas and move material with remote human oversight. On-site mining data is sent via mobile phone to a cloud-based platform. For mining, the convergence of these technologies provides extraordinary possibilities.

Technology at play

The potential impact is significant. A recent report by McKinsey & Co. showed the use of advanced analytics in mining and related industries had a major impact. Firms using these programs to assess production areas increased their profit margins by 2-3 percentage points.

One mining company used so-called Monte Carlo simulations to reduce certain capital expenses. Monte Carlo simulations use complex algorithms and repeated random sampling to model possible outcomes. They're frequently used in finance, biology, and insurance. The Mining Journal reported how the company challenged assumptions about a project's capital needs. It took historical data on certain disruptions, such as rainfall patterns, and then built models of its mines showing the impact of flooding and rainwater. The data led to a new strategy that maximized storage capacity and handling across all its mines. Capital costs dropped by 20 percent.
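
The kind of Monte Carlo exercise described above can be sketched in a few lines. The rainfall distribution and storage figures below are illustrative assumptions, not data from the Mining Journal report:

```python
import random

def simulate_flood_risk(storage_capacity_ml, n_trials=100_000, seed=42):
    """Estimate the probability that seasonal rainfall overwhelms a mine's
    water-storage capacity, by repeated random sampling (Monte Carlo).
    Rainfall is drawn from a normal distribution with hypothetical
    parameters: mean 800 ML per season, standard deviation 250 ML."""
    rng = random.Random(seed)
    overflows = 0
    for _ in range(n_trials):
        rainfall = rng.gauss(800, 250)  # one simulated wet season
        if rainfall > storage_capacity_ml:
            overflows += 1
    return overflows / n_trials

# Compare two candidate storage designs before committing capital
print(simulate_flood_risk(1000))  # smaller, cheaper design
print(simulate_flood_risk(1300))  # larger design with more headroom
```

Running many sampled seasons through each design turns a capital-planning argument into a comparison of estimated overflow probabilities.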


Buy or sell?

With so many variables at play, mining valuation is not for the faint of heart. Integrated data streams available at the discovery stage make for better informed purchase decisions.

Software programs today can take data to build and validate exploration models. These programs use 3D visualization and validated geophysical, analytical, and drill hole data. In turn, detailed 3D topographical models are possible.

Other programs assess historical, assay, and drilling data. This information creates viable scenarios for determining whether to buy or sell a site.

These tools use data consistently from one potential site to the next, allowing for forecasting of economic risk that is consistent across the organization. The firm today can use “real options valuation” to develop models of outcomes given changing economic conditions. With clearer information about potential risks, firms can decide whether to stage, sell, abandon, expand, or buy.

Anticipating, not reacting

Mining companies realize today that these analytic platforms and dashboards offer many advantages. Users have a clearer interpretation of the aggregated and analyzed data points from multiple areas. Using predictive analytics, mining decisions are made based on smart assumptions, not just historical information.

Robust software programs can generate reports almost instantaneously. Supervisors have on-site access to the analysis through a web browser or app. This data has many uses. Drilling managers save time and can make quicker decisions on next moves. Supplies can be ordered faster. Needed data for accreditation and compliance is immediately accessible.

Selecting the right sites

One example is assay analysis. Today, geologists do not wait weeks or months for assay results. Instead of off-site analysis, web-based applications deliver information much faster to inform decisions.

Robots are sending information about field operations, safety, needed maintenance, and drilling performance.  Some devices send the information themselves. In other cases, staff use mobile phones, tablets, or laptops.  This information and analytics in turn help with site selection. Integrating data from mine planning, ventilation, safety, rock engineering, and mineral resources improves overall forecasting.

Discovery, particularly of Tier 1 sites, is an increasingly costly venture for mining companies. Demand for many products is increasing while discovery rates are dropping. Mined product is of a lesser quality, particularly in mature mining locations. Many possible sites are in underexplored areas with difficult and deep cover.

The advanced technologies available today are contributing to rapid improvement in these discovery issues.

Prospective drilling

Consider the drill hole. To reduce costs in exploration, there needs to be enough rich information from the opening drill hole. It needs to be delivered in as close to real time as possible. Doing so lessens the risk of the second drill hole. Better information from the start helps improve vectoring. It provides better information about what mineral systems are being drilled.

This approach, called prospective drilling, is becoming increasingly used in mining. It employs drilling activity to map covered mineral systems. In turn, geochemical and geophysical vectoring can lead firms toward deposits.

Australia has invested heavily in this area. The Deep Exploration Technologies Cooperative Research Centre (DET CRC) has a singular vision: uncovering the future. Its core purpose is to "develop transformational technologies for successful mineral exploration through deep, barren cover rocks."

To get to that point, the DET CRC is borrowing a drilling technique from the oil business. Coiled tubing is paired with downhole and top-of-the-hole sensors. The information provides petrophysical, structural, rock fabric, geochemical, and mineralogical data all at once.

Conclusion

To meet increasing demands for new viable sites, and to improve efficiency on existing sites, mining is changing. Using smart, connected products and robust data modeling, mining is being done faster, safer, and more efficiently than ever.

Join a LiveTwitterChat on digitalization in mining on May 4th from 10-11 a.m. EST: #digitalmining

The global mining and metals industry will come together to discuss how digital innovation is impacting the mining industry July 12-14 at the International SAP Conference for Mining and Metals in Frankfurt, Germany.  Don’t miss this opportunity to meet with world leaders and learn how your organization can become a connected digital enterprise.

Follow speakers and pre-event activities via sapmmconf and @sapmillmining on Twitter.



About Ruediger Schroedter

Ruediger Schroedter is responsible for solution management of SAP solutions for the mining industry worldwide. He has spent more than 15 years in the mill products and mining industries and gained extensive experience implementing SAP solutions for customers in these industries before joining SAP.

Unlock Your Digital Super Powers: How Digitization Helps Companies Be Live Businesses

Erik Marcade and Fawn Fitter

The Port of Hamburg handles 9 million cargo containers a year, making it one of the world’s busiest container ports. According to the Hamburg Port Authority (HPA), that volume doubled in the last decade, and it’s expected to at least double again in the next decade—but there’s no room to build new roads in the center of Hamburg, one of Germany’s historic cities. The port needed a way to move more freight more efficiently with the physical infrastructure it already has.

The answer, according to an article on ZDNet, was to digitize the processes of managing traffic into, within, and back out of the port. By deploying a combination of sensors, telematics systems, smart algorithms, and cloud data processing, the Port of Hamburg now collects and analyzes a vast amount of data about ship arrivals and delays, parking availability, ground traffic, active roadwork, and more. It generates a continuously updated model of current port conditions, then pushes the results through mobile apps to truck drivers, letting them know exactly when ships are ready to drop off or receive containers and optimizing their routes. According to the HPA, the port is now on track to handle 25 million cargo containers a year by 2025 without further congestion or construction, helping shipping companies bring more goods and raw materials in less time to businesses and consumers all across Europe.

In the past, the port could only have solved its problem with backhoes and building permits—which, given the physical constraints, means the problem would have been unsolvable. Today, though, software and sensors are allowing it to improve processes and operations to a previously impossible extent. Big Data analysis, data mining, machine learning, artificial intelligence (AI), and other technologies have finally become sophisticated enough to identify patterns not just in terabytes but in petabytes of data, make decisions accordingly, and learn from the results, all in seconds. These technologies make it possible to digitize all kinds of business processes, helping organizations become more responsive to changing market conditions and more able to customize interactions to individual customer needs. Digitization also streamlines and automates these processes, freeing employees to focus on tasks that require a human touch, like developing innovative strategies or navigating office politics.

In short, digitizing business processes is key to ensuring that the business can deliver relevant, personalized responses to the market in real time. And that, in turn, is the foundation of the Live Business—a business able to coordinate multiple functions in order to respond to and even anticipate customer demand at any moment.

Some industries and organizations are on the verge of discovering how business process digitization can help them go live. Others have already started putting it into action: fine-tuning operations to an unprecedented level across departments and at every point in the supply chain, cutting costs while turbocharging productivity, and spotting trends and making decisions at speeds that can only be called superhuman.

Balancing Insight and Action

Two kinds of algorithms drive process digitization, says Chandran Saravana, senior director of advanced analytics at SAP. Edge algorithms operate at the point where customers or other end users interact directly with a sensor, application, or Internet-enabled device. These algorithms, such as speech or image recognition, focus on simplicity and accuracy. They make decisions based primarily on their ability to interpret input with precision and then deliver a result in real time.

Edge algorithms work in tandem with, and sometimes mature into, server-level algorithms, which report on both the results of data analysis and the analytical process itself. For example, the complex systems that generate credit scores assess how creditworthy an individual is, but they also explain to both the lender and the credit applicant why a score is low or high, what factors went into calculating it, and what an applicant can do to raise the score in the future. These server-based algorithms gather data from edge algorithms, learn from their own results, and become more accurate through continuous feedback. The business can then track the results over time to understand how well the digitized process is performing and how to improve it.

From Data Scarcity to a Glut

To operate in real time, businesses need an accurate data model that compares what’s already known about a situation to what’s happened in similar situations in the past to reach a lightning-fast conclusion about what’s most likely to happen next. The greatest barrier to this level of responsiveness used to be a lack of data, but the exponential growth of data volumes in the last decade has flipped this problem on its head. Today, the big challenge for companies is having too much data and not enough time or power to process it, says Saravana.

Even the smartest human is incapable of gathering all the data about a given situation, never mind considering all the possible outcomes. Nor can a human mind reach conclusions at the speed necessary to drive Live Business. On the other hand, carefully crafted algorithms can process terabytes or even petabytes of data, analyze patterns and detect outliers, arrive at a decision in seconds or less—and even learn from their mistakes (see How to Train Your Algorithm).

How to Train Your Algorithm 

The data that feeds process digitization can’t just simmer.
It needs constant stirring.

Successfully digitizing a business process requires you to build a model of the business process based on existing data. For example, a bank creates a customer record that includes not just the customer’s name, address, and date of birth but also the amount and date of the first deposit, the type of account, and so forth. Over time, as the customer develops a history with the bank and the bank introduces new products and services, customer records expand to include more data. Predictive analytics can then extrapolate from these records to reach conclusions about new customers, such as calculating the likelihood that someone who just opened a money market account with a large balance will apply for a mortgage in the next year.
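
As a rough sketch of that kind of extrapolation, the toy logistic model below scores a hypothetical customer's likelihood of applying for a mortgage. The weights are hand-set stand-ins for coefficients a real model would learn from historical customer records:

```python
import math

def mortgage_likelihood(balance, months_as_customer, has_money_market):
    """Toy predictive score: probability that a customer applies for a
    mortgage in the next year, via a logistic model. The coefficients are
    illustrative assumptions, not output of a trained model."""
    z = (-4.0
         + 0.00002 * balance          # larger balances nudge the score up
         + 0.01 * months_as_customer  # longer relationships do too
         + 1.5 * (1 if has_money_market else 0))
    return 1 / (1 + math.exp(-z))

# A new customer who just opened a money market account with a large balance
score = mortgage_likelihood(balance=100_000, months_as_customer=1,
                            has_money_market=True)
print(round(score, 2))
```

The point is not the particular numbers but the shape of the process: each new field added to the customer record becomes another term the model can weigh.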

To keep data models accurate, you have to have enough data to ensure that your models are complete—that is, that they account for every possible predictable outcome. The model also has to push outlying data and exceptions, which create unpredictable outcomes, to human beings who can address their special circumstances. For example, an algorithm may be able to determine that a delivery will fail to show up as scheduled and can point to the most likely reasons why, but it can only do that based on the data it can access. It may take a human to start the process of locating the misdirected shipment, expediting a replacement, and establishing what went wrong by using business knowledge not yet included in the data model.

Indeed, data models need to be monitored for relevance. Whenever the results of a predictive model start to drift significantly from expectations, it's time to examine the model to determine whether you need to dump old data that no longer reflects your customer base, add a new product or subtract a defunct one, or include a new variable, such as marital status or length of customer relationship, that further refines your results.
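
A minimal version of that monitoring step might compare a model's recent average error with its long-run baseline; the window size and tolerance below are illustrative assumptions, not recommended values:

```python
def drift_detected(errors, window=30, tolerance=1.5):
    """Flag a predictive model for review when its recent average error
    drifts well above its long-run baseline. `errors` is a chronological
    list of absolute prediction errors; this is a minimal stand-in for a
    real model-monitoring pipeline."""
    if len(errors) < 2 * window:
        return False  # not enough history to compare against
    baseline = sum(errors[:-window]) / (len(errors) - window)
    recent = sum(errors[-window:]) / window
    return recent > tolerance * baseline
```

When the check fires, a human revisits the model; the code only decides when that conversation needs to happen.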

It’s also important to remember that data doesn’t need to be perfect—and, in fact, probably shouldn’t be, no matter what you might have heard about the difficulty of starting predictive analytics with lower-quality data. To train an optical character recognition system to recognize and read handwriting in real time, for example, your data stores of block printing and cursive writing samples also have to include a few sloppy scrawls so the system can learn to decode them.

On the other hand, in a fast-changing marketplace, all the products and services in your database need consistent and unchanging references, even though outside the database, names, SKUs, and other identifiers for a single item may vary from one month or one order to the next. Without consistency, your business process model won’t be accurate, nor will the results.

Finally, when you’re using algorithms to generate recommendations to drive your business process, the process needs to include opportunities to test new messages and products against existing successful ones as well as against random offerings, Saravana says. Otherwise, instead of responding to your customers’ needs, your automated system will actually control their choices by presenting them with only a limited group of options drawn from those that have already received the most positive results.

Any process is only as good as it’s been designed to be. Digitizing business processes doesn’t eliminate the possibility of mistakes and problems, but it does ensure that the mistakes and problems that arise are easy to spot and fix.

From Waste to Gold

Organizations moving to digitize and streamline core processes are even discovering new business opportunities and building new digitized models around them. That’s what happened at Hopper, an airfare prediction app firm in Cambridge, Massachusetts, which discovered in 2013 that it could mine its archives of billions of itineraries to spot historical trends in airfare pricing—data that was previously considered “waste product,” according to Hopper’s chief data scientist, Patrick Surry.

Hopper developed AI algorithms to correlate those past trends with current fares and to predict whether and when the price of any given flight was likely to rise or fall. The results were so accurate that Hopper jettisoned its previous business model. “We check up to 3 billion itineraries live, in real time, each day, then compare them to the last three to four years of historical airfare data,” Surry says. “When consumers ask our smartphone app whether they should buy now or wait, we can tell them, ‘yes, that’s a good deal, buy it now,’ or ‘no, we think that fare is too expensive, we predict it will drop, and we’ll alert you when it does.’ And we can give them that answer in less than one second.”
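
Hopper has not published its algorithms; as a much-simplified illustration of the buy-or-wait comparison, the sketch below ranks a quoted fare against a route's hypothetical fare history:

```python
def buy_or_wait(current_fare, historical_fares, threshold=0.30):
    """Recommend 'buy' when the quoted fare sits in the cheap end of the
    historical fare distribution for a route, 'wait' otherwise. This is a
    deliberately simple stand-in for a real fare predictor; threshold=0.30
    means the cheapest 30% of past fares count as a good deal."""
    cheaper = sum(1 for fare in historical_fares if fare <= current_fare)
    percentile = cheaper / len(historical_fares)
    return "buy" if percentile <= threshold else "wait"

# Hypothetical fare history for one route over recent seasons
history = [420, 450, 380, 510, 600, 395, 430, 475, 560, 405]
print(buy_or_wait(390, history))  # near the historical low
print(buy_or_wait(550, history))  # well above typical fares
```

A production system would also model seasonality and days-until-departure, but the core comparison, current quote versus historical distribution, is the same.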

When consumers ask our smartphone app whether they should buy now or wait, we can tell them, ‘yes, that’s a good deal, buy it now’.

— Patrick Surry, chief data scientist, Hopper

While trying to predict airfare trends is nothing new, Hopper has told TechCrunch that it can not only save users up to 40% on airfares but it can also find them the lowest possible price 95% of the time. Surry says that’s all due to Hopper’s algorithms and data models.

The Hopper app launched on iOS in January 2015 and on Android eight months later. The company also switched in September 2015 from directing customers to external travel agencies to taking bookings directly through the app for a small fee. The Hopper app has already been downloaded to more than 2 million phones worldwide.

Surry predicts that we’ll soon see sophisticated chatbots that can start with vague requests from customers like “I want to go somewhere warm in February for less than $500,” proceed to ask questions that help users narrow their options, and finally book a trip that meets all their desired parameters. Eventually, he says, these chatbots will be able to handle millions of interactions simultaneously, allowing a wide variety of companies to reassign human call center agents to the handling of high-value transactions and exceptions to the rules built into the digitized booking process.

Port of Hamburg Lets the Machines Untangle Complexity

In early 2015, AI experts told Wired magazine that at least another 10 years would pass before a computer could best the top human players at Go, an ancient game that’s exponentially harder than chess. Yet before the end of that same year, Wired also reported that machine learning techniques drove Google’s AlphaGo AI to win four games out of five against one of the world’s top Go players. This feat proves just how good algorithms have become at managing extremely complex situations with multiple interdependent choices, Saravana points out.

The Port of Hamburg, which has digitized traffic management for an estimated 40,000 trucks a day, is a good example. In the past, truck drivers had to show up at the port to check traffic and parking message boards. If they arrived before their ships docked, they had to drive around or park in the neighboring residential area, contributing to congestion and air pollution while they waited to load or unload. Today, the HPA’s smartPORT mobile app tracks individual trucks using telematics. It customizes the information that drivers receive based on location and optimizes truck routes and parking in real time so drivers can make more stops a day with less wasted time and fuel.

The platform that drives the smartPORT app also uses sensor data in other ways: it tracks wind speed and direction and transmits the data to ship pilots so they can navigate in and out of the port more safely. It monitors emissions and their impact on air quality in various locations in order to adjust operations in real time for better control over environmental impact. It automatically activates streetlights for vehicle and pedestrian traffic, then switches them off again to save energy when the road is empty. This ability to coordinate and optimize multiple business functions on the fly makes the Port of Hamburg a textbook example of a Live Business.

Digitization Is Not Bounded by Industry

Other retail and B2B businesses of all types will inevitably join the Port of Hamburg in further digitizing their processes, both in predictable ways and in ways we can only begin to imagine.

Customer service, for example, is likely to be in the vanguard. Automated systems already feed information about customers to online and phone-based service representatives in real time, generate cross-selling and upselling opportunities based on past transactions, and answer customers’ frequently asked questions. Saravana foresees these systems becoming even more sophisticated, powered by AI algorithms that are virtually indistinguishable from human customer service agents in their ability to handle complex live interactions in real time.

In manufacturing and IT, Sven Bauszus, global vice president and general manager for predictive analytics at SAP, forecasts that sensors and predictive analysis will further automate the process of scheduling and performing maintenance, such as monitoring equipment for signs of failure in real time, predicting when parts or entire machines will need replacement, and even ordering replacements preemptively. Similarly, combining AI, sensors, data mining, and other technologies will enable factories to optimize workforce assignments in real time based on past trends, current orders, and changing market conditions.
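The "predict when parts will need replacement" step Bauszus describes usually reduces to trend extrapolation over sensor readings. The sketch below is an assumption-laden toy version (a simple linear trend and a made-up vibration threshold), not SAP's actual predictive analytics logic.

```python
# Illustrative predictive-maintenance rule: watch a sensor series, estimate
# its trend, and flag a part for preemptive replacement if the projected
# value crosses a failure threshold within the planning horizon.

def predict_failure(readings, threshold, horizon=5):
    """Linear-trend extrapolation: True if the reading is projected to
    reach `threshold` within `horizon` future steps."""
    if len(readings) < 2:
        return False  # not enough history to estimate a trend
    slope = (readings[-1] - readings[0]) / (len(readings) - 1)
    projected = readings[-1] + slope * horizon
    return projected >= threshold

# Vibration amplitude creeping upward toward a failure threshold of 10.0:
history = [4.0, 4.6, 5.3, 5.9, 6.8]
if predict_failure(history, threshold=10.0):
    print("order replacement part")  # trigger the preemptive order
```

A production system would replace the linear trend with a trained model and wire the positive case into an automated purchase order, but the decision shape (monitor, project, act early) is the same.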

Public health will be able to go live with technology that spots outbreaks of infectious disease, determines where medical professionals and support personnel are needed most and how many to send, and helps ensure that they arrive quickly with the right medication and equipment to treat patients and eradicate the root cause. It will also make it easier to track communicable illnesses, find people who are symptomatic, and recommend approaches to controlling the spread of the illness, Bauszus says.

He also predicts that the insurance industry, which has already begun to digitize its claims-handling processes, will refine its ability to sort through more claims in less time with greater accuracy and higher customer satisfaction. Algorithms will be better and faster at flagging claims that have a high probability of being fraudulent and then pushing them to claims inspectors for investigation. Simultaneously, the same technology will be able to identify and resolve valid claims in real time, possibly even cutting a check or depositing money directly into the insured person’s bank account within minutes.
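The claims flow described above is a routing decision on top of a fraud score. Here is a minimal sketch of that triage logic; the scoring model and the 0.8 cutoff are placeholders for whatever model and risk tolerance an insurer actually uses.

```python
# Claims triage sketch: likely-fraudulent claims go to a human inspector,
# low-risk valid claims are paid out immediately. Score and cutoff are
# hypothetical stand-ins for a trained fraud model.

def triage_claim(claim, fraud_score, cutoff=0.8):
    """Route a claim based on its fraud probability."""
    if fraud_score >= cutoff:
        return ("inspector_queue", claim["id"])          # human investigation
    return ("auto_pay", claim["id"], claim["amount"])    # instant settlement

print(triage_claim({"id": "C-1", "amount": 1200.0}, fraud_score=0.91))
print(triage_claim({"id": "C-2", "amount": 300.0}, fraud_score=0.12))
```

The "cutting a check within minutes" promise lives entirely in the `auto_pay` branch; the inspectors only ever see the high-score tail.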

Financial services firms will be able to apply machine learning, data mining, and AI to accelerate the process of rating borrowers’ credit and detecting fraud. Instead of filling out a detailed application, consumers might be able to get on-the-spot approval for a credit card or loan after inputting only enough information to be identified. Similarly, banks will be able to alert customers to suspicious transactions by text message or phone call—not within a day or an hour, as is common now, but in a minute or less.

Pitfalls and Possibilities

As intelligent as business processes can be programmed to be, there will always be a point beyond which they have to be supervised. Indeed, Saravana forecasts increasing regulation around when business processes can and can’t be digitized. Especially in areas involving data security, physical security, and health and safety, it’s one thing to allow machines to parse data and arrive at decisions to drive a critical business process, but it’s another thing entirely to allow them to act on those decisions without human oversight.

Automated, impersonal decision making is fine for supply chain automation, demand forecasting, inventory management, and other processes that need faster-than-human response times. In human-facing interactions, though, Saravana insists that it’s still best to digitize the part of the process that generates decisions, but leave it to a human to finalize the decision and decide how to put it into action.

“Any time the interaction is machine-to-machine, you don’t need a human to slow the process down,” he says. “But when the interaction involves a person, it’s much more tricky, because people have preferences, tastes, the ability to try something different, the ability to get fatigued—people are only statistically predictable.”

For example, technology has made it entirely possible to build a corporate security system that can gather information from cameras, sensors, voice recognition technology, and other IP-enabled devices. The system can then feed that information in a steady stream to an algorithm designed to identify potentially suspicious activity and act in real time to prevent or stop it while alerting the authorities. But what happens when an executive stays in the office unusually late to work on a presentation and the security system misidentifies her as an unauthorized intruder? What if the algorithm decides to lock the emergency exits, shut down the executive’s network access, or disable her with a Taser instead of simply sending an alert to the head of security asking what to do while waiting for the police to come?

The Risk Is Doing Nothing

The greater, if less dramatic, risk associated with digitizing business processes is simply failing to pursue it. It’s true that taking advantage of new digital technologies can be costly in the short term. There’s no question that companies have to invest in hardware, software, and qualified staff in order to prepare enormous data volumes for storage and analysis. They also have to implement new data sources such as sensors or Internet-connected devices, develop data models, and create and test algorithms to drive business processes that are currently analog. But as with any new technology, Saravana advises, it’s better to start small with a key use case, rack up a quick win with high ROI, and expand gradually than to drag your heels out of a failure to grasp the long-term potential.

The economy is digitizing rapidly, but not evenly. According to the McKinsey Global Institute’s December 2015 Digital America report, “The race to keep up with technology and put it to the most effective business use is producing digital ‘haves’ and ‘have-mores’—and the large, persistent gap between them is becoming a decisive factor in competition across the economy.” Companies that want to be among the have-mores need to commit to Live Business today. Failing to explore it now will put them on the wrong side of the gap and, in the long run, rack up a high price tag in unrealized efficiencies and missed opportunities.


About Erik Marcade

Erik Marcade is vice president of Advanced Analytics Products at SAP.

Digital Transformation Needs More Than Technology (Part 2)

Andreas Hauser

In my last blog, I explained why design, design thinking, and experience matter in digital transformation, which goes beyond pure technology and business skills. The credo is to engage with your customers, and most importantly, with the users right from the beginning, in an iterative, user-centric design process.

Digital transformation is more than a one-time project; it is a journey. Ultimately, enterprises want to prepare their organization for a sustainable design-led digital transformation.

But how do you achieve this? That is the focus of today’s blog.

Foster a design-led innovation culture

Changing a company culture is not easy. If you don’t want your company to end up like Nokia or BlackBerry, you had better start experimenting with new practices sooner rather than later. More and more companies are training their people on design thinking and want to establish a culture of design-led innovation.

My formula for innovation culture is people + process + place.

You need to have the right skills in your organization (people): business, technical, and design skills to better understand the needs of customers and users. Business and technical skills are typically not the problem. But how many people with a design background do you have in your organization? This is why more and more companies are hiring designers and training their employees on design thinking.

It is not just about having people with the right skills. You also need to change the way you engage with customers and users (process). To be successful, you need to combine design thinking with agile methodologies. The process is pretty simple: get people with the right skills working together as one team, and iterate from beginning to end with the customers and users. It sounds simple, but it can be difficult to execute in large, global organizations.

Let me tell you a story.

We trained about 300 people (business + IT) at one of our customers in design thinking and helped establish five design thinking coaches in their organization. The most interesting outcome: after the exercise, seven out of 10 IT projects were initiated by the business. The customer told us this was the first time the business had proactively wanted to work with IT. That is a great start toward repairing a relationship that has grown distant in many organizations in recent years.

In our experience, the place where people work together also has a huge impact on creativity. That is why we have established “customer-facing” co-innovation spaces at SAP – called SAP AppHaus – where customers and SAP collaborate and co-innovate as partners. Establishing creative spaces within your organization gives the cultural change a face: skills and new ways of working are often not very visible, but a space is physical, and people can see and feel that the organization is changing. Check out the virtual walk-through of our AppHaus in Heidelberg.

In my last blog, I discussed our co-innovation journey with Mercedes-AMG, which paved Mercedes-AMG’s way to a sustainable digital transformation. Based on my experience, you first need to show with a lighthouse project that this methodology creates business value for the company. You can then build on this success and start the journey to establish a sustainable design-led culture in your organization.

Be prepared for a long journey. It takes time to change the way organizations work. There has never been a more exciting time for designers, because the industry is starting to see the huge value they can bring to organizations and to a successful digital transformation.

Have a look at a more detailed presentation and a video recording about the concept that “digital transformation is more than technology.” Learn more about customer success stories on the UX Design Services web page.

Stay tuned for more articles as part of this blog series, in which you can explore further perspectives on digital transformation and its various aspects, learn about organizational readiness for design thinking, and assess how ready your organization is to embark on this journey.

Read Part 1 of this discussion: Digital Transformation Needs More Than Technology.

This article originally appeared on SAP Business Trends.


About Andreas Hauser

Andreas is global head of the Design and Co-Innovation Center at SAP. His team drives customer and strategic design projects through co-innovation and design thinking. Previously, he was Vice President of User Experience for OnDemand solutions at SAP SE.