#GartnerBI and Big Data: Fear, Loathing, and Business Breakthroughs

Timo Elliott

At the Gartner Business Intelligence and Analytics Summit in Barcelona, there was near-universal agreement about three things to do with “big data”:

  1. It’s an awful term (but we’re stuck with it)
  2. Whatever it means, it’s a big deal, and requires big changes to traditional information infrastructures
  3. It will result in big new business opportunities

“Big Data” is a terrible term

Gartner analyst Doug Laney first coined the term “big data” over 12 years ago (at least in its current form – people have been complaining about “information overload” since Roman times). But the term’s meaning is still far from clear, and it was nominated the #1 “tech buzzword that everyone uses but doesn’t quite understand” (followed closely by “cloud”).

When using the term, Gartner usually keeps the quote marks in place (i.e. it’s “big data”, not big data). Here’s the definition provided by analyst Ted Friedman to “de-hype” the term during the summit keynote:

“Big data” are high-volume, high-velocity, and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision-making.

Analyst Donald Feinberg warned people that “talking only about big data can lead to self-delusion” and urged them not to “surrender to the hype-ocracy.” He left no doubt about where he stood on the use of the term: “Big data doesn’t mean MapReduce or Hadoop. Big data doesn’t exist, it’s meaningless, it’s ridiculous…” The audience started applauding, to which he replied: “Why are you clapping?! Why do you all fall for it? Why do the vendors do it?!”

As SAP’s Jason Rose put it: “How can we demystify this? …Easy, drop the ‘big’. Data has always been the key challenge in BI.”

For what it’s worth, I have my own tongue-in-cheek definition – and it’s worth noting that big data is quickly becoming the default term for what we used to call analytics or business intelligence.

But Big Data is a Big Deal

Despite the problems, Doug Laney noted that big data is the most-searched-for term on Gartner.com. Why is it so popular? Maybe because it’s so nebulous that people want to check whether they have understood it. Or maybe because there’s no more precise term for the new analytic opportunities. And maybe because “hype has a value,” as Ted Friedman put it: big data has proved to be a new opportunity to talk to business people about the power of analytics, and because everybody’s searching for it, vendors would be crazy not to include it in their marketing.

Conference attendees generally believed that the biggest opportunity for big data analysis was new insights from the “dark data” that lies unused within organizations today. Gartner highlighted the dangers of implementing shiny new big data technology separate from existing analytics infrastructures: “Do not make your big data implementations siloed. Make them part of the overall strategy for BI,” said Ted Friedman. “Link to stuff you are already doing. Don’t make big data a standalone thing. And don’t feel like you’ve got to go out and buy a whole new technology stack.”

Analyst Rita Sallam, in a session on data variety, gave some examples of the new opportunities.

Some of Gartner’s public predictions related to big data:

  • By 2015, 65 percent of packaged analytic applications with advanced analytics will come embedded with Hadoop.
  • By 2016, 70 percent of leading BI vendors will have incorporated natural-language and spoken-word capabilities.
  • By 2015, more than 30 percent of analytics projects will deliver insights based on structured and unstructured data.

But What Does It Mean For The Business?

Donald Feinberg: “Realize that big data is not about doing ‘more’ of the same thing – it’s about doing things differently.” “The major opportunities for big data are around ways to transform the business and disrupt the industry,” said Doug Laney. These included radically changing existing business processes; introducing new, more-personalized products and services; and “answering chewy questions that weren’t possible before.” Some examples:

  • Netflix did deep analysis of its viewers’ preferences, and used that to craft the new “House of Cards” TV series – a $100M investment
  • New financial lenders are using big data to find untapped banking opportunities – including lending scores based on what you say on social media
  • Passur uses big data to provide real-time monitoring of air traffic, potentially saving millions of dollars per year. Today, pilots’ estimated times of arrival are off by more than ten minutes 10 percent of the time, and by more than five minutes 30 percent of the time – knowing exactly when planes will arrive means more automation, better operating efficiencies, improved security, etc.
  • Enologix analyzes the chemical composition of new wines to predict their Wine Spectator score, and offers advice on how to improve it
  • Dollar General, Kroger, and other retailers provide data to partners to analyze, in exchange for “free strategic advice”
  • Insurance companies are using text mining on previously-unexamined “dark data” on claims forms to sniff out indicators of fraud

Mats-Olov Eriksson of King.com, a Sweden-based casual gaming site (card games, social games, etc.), gave an entertaining presentation called “Beyond Big and Data,” hosted by recently returned Gartner analyst Frank Buytendijk. It was illustrated with more cute kittens, puppies, and otters than I’ve ever seen at a technology conference!

Mats’s title is “Spiritual Leader, Data Warehouse,” and he explained how the company collects massive amounts of log data about user activity, then uses a combination of Hadoop, traditional data warehousing, and BI cubes to optimize the player experience – and the company’s profits.

The company works with Facebook on games like Candy Crush Saga. The company’s main revenue comes from in-app purchases: for example, you might get five free lives in a game, and if you run out you can either wait for a period or pay a small sum to play immediately. To figure out which options generate the most revenue and keep players in the game, the company currently processes 60,000 log files per day. The files contain around three billion “events” per day, and the company expects that to rise to six billion by the end of this year. Mats explained that big data is best compared to a software development project, requiring hard-to-find skilled personnel. He characterized Hadoop as a “powerful but immature five-year-old” and explained that they use a MySQL data mart for low-latency access: “Hive is anything but low latency – it takes fifteen seconds before you even get an error message!”
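
Mats’s architecture boils down to a classic pattern: do the heavy aggregation offline, and serve only the small summaries from a fast store. Here’s a minimal Python sketch of that batch-then-serve split; the event fields, metric names, and the dict standing in for their MySQL mart are hypothetical illustrations, not King.com’s actual schema.

```python
# Minimal sketch of the batch-then-serve pattern described above: heavy
# aggregation runs offline (Hive/Hadoop at King.com), and the small
# summaries are pushed to a low-latency store (their MySQL data mart).
from collections import defaultdict

raw_events = [  # stand-in for one day's parsed log lines
    {"player": "p1", "game": "candy_crush", "event": "life_purchased", "amount": 0.99},
    {"player": "p2", "game": "candy_crush", "event": "life_purchased", "amount": 0.99},
    {"player": "p1", "game": "candy_crush", "event": "level_failed", "amount": 0.0},
]

def batch_aggregate(events):
    """Offline job: roll billions of raw events up into tiny per-game metrics."""
    revenue = defaultdict(float)
    for e in events:
        if e["event"] == "life_purchased":
            revenue[e["game"]] += e["amount"]
    return dict(revenue)

# Only the aggregate is loaded into the low-latency mart; dashboards
# query this, never the raw event store.
low_latency_mart = batch_aggregate(raw_events)
print(low_latency_mart)  # {'candy_crush': 1.98}
```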

One of the recent challenges the company has faced is information governance – a term they hadn’t even used until six months ago. Mats noted that the European privacy laws were by far the “harshest” they deal with.

In a separate presentation that I unfortunately didn’t attend, Tom Fastner of eBay gave an overview of his company’s analytic systems, explaining that they process more than 100 petabytes of data I/O every day, and that a “Singularity” key-value store on Teradata manages 36 petabytes of data. The largest table is 3 petabytes / 3 trillion rows, and takes only 32 seconds to scan. Here’s a presentation from 2011 that talks about how eBay combines structured, semi-structured, and unstructured data.
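
Those scan figures are worth a quick sanity check. The back-of-the-envelope arithmetic below shows the implied throughput – numbers only a massively parallel system can sustain.

```python
# Back-of-the-envelope check on the eBay figures quoted above
# (a 3 PB / 3 trillion row table scanned in 32 seconds):
rows = 3e12
bytes_scanned = 3 * 1024**5   # 3 PB expressed in bytes
seconds = 32

print(f"{rows / seconds:.1e} rows per second")                   # ~9.4e10
print(f"{bytes_scanned / seconds / 1024**4:.0f} TB per second")  # ~96
```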

The inimitable Steve Lucas says of big data: “when you see it, you know it.” MKI, a bioscience company, is using big data to analyze genome data, using a combination of Hadoop, open-source “R” algorithms, and SAP’s in-memory platform, SAP HANA (registration required).

[Slides: a summary of the MKI project]

In Conclusion

Big data is a lousy term, but it offers big opportunities in return for some big information-infrastructure changes. What does the future hold? Let’s hope for less of the hype, and more of the business change…

About Timo Elliott

Timo Elliott is an innovation evangelist and international conference speaker who has presented to business and IT audiences in over forty countries around the world. A 23-year veteran of SAP BusinessObjects, Elliott works closely with SAP development and innovation centers around the world on new technology directions. His popular Business Analytics blog at timoelliott.com tracks innovation in analytics and social media, including topics such as big data, collaborative decision-making, and social analytics. Prior to Business Objects, Elliott was a computer consultant in Hong Kong and led analytics projects for Shell in New Zealand. He holds a first-class honors degree in Economics with Statistics from Bristol University, England.

Hockey Fans Rejoice! SAPPHIRE NOW And ASUG Showcase The Ultimate Fan Experiences

Fred Isbell

Customer sessions at the SAPPHIRE NOW and ASUG conferences are always fascinating, spanning many industries and detailing the latest in best practices, technology, and innovative thinking. No one is left out – not even the most loyal hockey fan.

Last year, SAP CEO Bill McDermott brought the Stanley Cup – the “holy grail” of the National Hockey League (NHL) – on stage. Talk about an iconic moment! Bill was highlighting the new SAP analytics solution, which currently enables the NHL to calculate statistics on the SAP HANA platform with a cloud-based solution. In fact, NHL Network and NHL on NBC broadcasts have featured player profiles with insights generated by SAP solutions throughout this season, including the playoffs currently in progress.

Although the Stanley Cup didn’t pay a return visit to the SAPPHIRE NOW stage this year, I wasn’t at all disappointed after attending customer sessions featuring two elite hockey teams: the Mannheim Adler Eagles and the San Jose Sharks.

The Mannheim Adler Eagles soar high

Mannheim is a German city that’s home to 400,000 people and the Mannheim Adler Eagles, a very successful professional ice hockey team in the German Hockey League (DEL). After winning the DEL championship seven times and hoisting the league’s version of the Stanley Cup last year, the Adler Eagles have been enjoying the limelight.

But great success comes with some challenges. With an arena capacity of 13,500 fans, the team has 7,600 season ticket holders and averages 11,300 fans per home game.

The problem? The Adler Eagles didn’t know their fans as well as they would like.

And rising smartphone use and an increasing level of game fixation made this issue more apparent. The team’s organization needed to dramatically increase and tighten fan engagement before, during, and after games.

The solution was a fan app designed specifically for the Adler Eagles that runs on both Apple iOS and Android. The app rewards loyalty and engages fans with a coupon-center approach, where points accrued through team-related activities and ticket purchases can be redeemed for discounted merchandise. Fans even receive points when the Eagles score a goal!

Overall, this approach is helping the Adler Eagles improve fan engagement, bring fans into the team store, and drive higher revenue. The team now has a far more complete profile and analysis of its fans, based on specific, real-time fan behavior. With this information, the organization is rewarding loyalty, developing a core fan base within and beyond its season ticket holders, and personalizing marketing campaigns with greater ease.

Because this app has been so successful, Mannheim is even considering a version for the Apple Watch. The team is looking to expand its SAP solution by using wearables and sensor-based analytics that can help optimize team performance. At the same time, it will also increase its use of statistics with SAP solutions as the engine for data collection.

San Jose Sharks seize unprecedented customer insights

The San Jose Sharks, celebrating the team’s 25th year in existence, are one of the premier NHL franchises. Hasso Plattner, chairman of the SAP Supervisory Board, is the majority owner and holds a seat on the NHL Board of Governors. This is an excellent example of a midsized business with a lot happening around it – and with all of that activity comes a need for new systems.

The Sharks’ major business challenges were not unlike those of other fast-moving organizations facing new, expanded business needs. The team’s business software systems had not kept pace with the business, a problem compounded by a ticket-sales system that doubled as its lead-management engine. Furthermore, every application was isolated and used standalone, creating islands of automation that left room for error and duplicate data.

The result was an inherently inefficient sales process – and, as we saw with the Adler Eagles, limited visibility into season ticket holders, including their intent to renew. The Sharks lacked one-on-one customer centricity and any understanding of the wider fan base, which significantly impaired the team’s sales pipeline and forecasts.

By turning to a solution for enhancing customer and fan engagement, with the business goal of using data to turn insight into action, the team now has a holistic, 360-degree view of the customer, supported by simplified, real-time data acquisition. In place of the previous islands of automation, the new solution rapidly converts information into value-added sales insight – giving the organization a single version of the truth.

The results the San Jose Sharks achieved were impressive. More than 70 people now use the system, and a quick rollout meant minimal impact on IT infrastructure. The team quickly realized improved sales and service processes, efficient lead management, and better forecasting within its sales pipeline. Season ticket renewals reached their highest level ever, thanks to a better understanding of the fan experience and of ticket holders’ pains and expectations. Plus, the team is driving targeted campaigns based on the specific customer segments that emerge from its data.

The Mannheim Adler Eagles and San Jose Sharks showcased innovative applications of technology to their core sports business. And as always, the fans are the winners, as they get to experience the coolest game on earth with two very successful, forward-thinking hockey teams.

For more, check out our research brief on The Future of Sports Marketing: Play Locally, Think Globally, Drive Loyalty.

About Fred Isbell

Fred Isbell is the Senior Director of SAP Digital Business Services Marketing at SAP. He is a results- and goal-oriented senior marketing executive with broad experience and expertise in high technology and marketing. He has a BA from Yale and an MBA from the Duke Fuqua School of Business.

How Big Data Is Changing The News Industry

Maggie Chan Jones

In the run-up to the U.S. presidential election, newsrooms are working at a fever pitch. But if we slow down a minute to take a closer look at modern-day news organizations, we might ask ourselves: Can they really provide accurate, unbiased information on current events at Twitter speed?

News, and the art of gathering it, has evolved dramatically in the last few years. How news is consumed is also light years away from where it was a decade ago. The explosive growth of the Internet and mobile devices has anyone and everyone broadcasting their opinions. The once-unified broadcast news landscape has shattered into millions of different sources, platforms, and feeds, each using curated content models that cater to readers, allowing them to pick and choose their sources.

With the expanding market of content platforms and multichannel news sources has come a myriad of perspectives. Does having this choice of who we listen to – or don’t listen to – make us unintentionally biased? This question is incredibly important to consider when we as a society come together to make informed decisions that impact everyone’s future.

Today’s major news organizations are balancing two realities. One is a civic responsibility for reliable, responsible journalism. The other is profitability, which mandates speedy content for readers on the go. This has forced news providers to become data-driven machines – seamlessly reacting across browsers, mobile screens, and social feeds 24×7. The imperative for speed has trumped traditional ways of reporting news: data algorithms now drive content, data-driven research and statistics have become an important supplement to the day’s news, and third-party data tools are in routine use.

But this new focus on Big Data is also a curse. A petabyte of unprocessed, unstructured data is almost as useful as having no data at all. That’s why better tools to manage Big Data and stronger data algorithms are needed to create content that can benefit today’s readers. This is an important initiative for SAP, and we’re providing technology that is already impacting the way news is prepared and consumed for important current events, such as the upcoming U.S. presidential election.

As the exclusive sponsor of Reuters’ Polling Explorer, SAP is working with Reuters to provide journalists and consumers the latest polling data, stories about the election, and more. Real-time data is giving Reuters the tools needed to deliver news with accuracy, speed, and integrity. The new Polling Explorer increased reader engagement from 240K visits across the entire 2012 election cycle to 6.2M in just the first four months after its November launch. The Reuters election app uses the new data system to match users with the candidate who best fits their political leanings. And Reuters can also use software to transform polling data and other data sets into visualizations that present facts and stats in a dynamic, interactive manner.
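
Reuters hasn’t published its matching logic, but issue-based candidate matching is typically a straightforward agreement score. Here is a hedged sketch, with invented issues, positions, and candidate names:

```python
# Hypothetical sketch of issue-based candidate matching: score the
# user's stated positions against each candidate's and return the
# closest match. All issues and positions below are invented.
def best_match(user, candidates):
    def agreement(a, b):
        shared = set(a) & set(b)                       # issues both sides answered
        return sum(a[i] == b[i] for i in shared) / max(len(shared), 1)
    return max(candidates, key=lambda name: agreement(user, candidates[name]))

user_views = {"trade": "pro", "immigration": "expand", "taxes": "raise"}
candidates = {
    "Candidate A": {"trade": "pro", "immigration": "expand", "taxes": "cut"},
    "Candidate B": {"trade": "anti", "immigration": "restrict", "taxes": "cut"},
}
print(best_match(user_views, candidates))  # Candidate A (2 of 3 issues agree)
```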

By providing technology platforms that are easy to use and scalable for businesses of any size, technology providers can give news organizations across the world a trove of insights that impact their readers in real time, especially during momentous, breaking news cycles.

For more insight on the power of Big Data, see The Risk And Reward Of Big Data.

This story originally appeared on SAP Business Trends

About Maggie Chan Jones

Maggie Chan Jones is CMO of SAP, responsible for leading SAP’s global advertising and brand experience, customer audience marketing, and field and partner marketing functions across all markets. Her mission is to bring to life SAP’s vision to help the world run better and improve people’s lives through storytelling, and to accelerate company growth. A career-marketer in the technology industry, Maggie has held a succession of roles at Microsoft, Sun Microsystems and other technology companies.

Unlock Your Digital Super Powers: How Digitization Helps Companies Be Live Businesses

Erik Marcade and Fawn Fitter

The Port of Hamburg handles 9 million cargo containers a year, making it one of the world’s busiest container ports. According to the Hamburg Port Authority (HPA), that volume doubled in the last decade, and it’s expected to at least double again in the next decade—but there’s no room to build new roads in the center of Hamburg, one of Germany’s historic cities. The port needed a way to move more freight more efficiently with the physical infrastructure it already has.

The answer, according to an article on ZDNet, was to digitize the processes of managing traffic into, within, and back out of the port. By deploying a combination of sensors, telematics systems, smart algorithms, and cloud data processing, the Port of Hamburg now collects and analyzes a vast amount of data about ship arrivals and delays, parking availability, ground traffic, active roadwork, and more. It generates a continuously updated model of current port conditions, then pushes the results through mobile apps to truck drivers, letting them know exactly when ships are ready to drop off or receive containers and optimizing their routes. According to the HPA, the port is now on track to handle 25 million cargo containers a year by 2025 without further congestion or construction, helping shipping companies bring more goods and raw materials in less time to businesses and consumers all across Europe.

In the past, the port could only have solved its problem with backhoes and building permits—which, given the physical constraints, means the problem would have been unsolvable. Today, though, software and sensors are allowing it to improve processes and operations to a previously impossible extent. Big Data analysis, data mining, machine learning, artificial intelligence (AI), and other technologies have finally become sophisticated enough to identify patterns not just in terabytes but in petabytes of data, make decisions accordingly, and learn from the results, all in seconds. These technologies make it possible to digitize all kinds of business processes, helping organizations become more responsive to changing market conditions and more able to customize interactions to individual customer needs. Digitization also streamlines and automates these processes, freeing employees to focus on tasks that require a human touch, like developing innovative strategies or navigating office politics.

In short, digitizing business processes is key to ensuring that the business can deliver relevant, personalized responses to the market in real time. And that, in turn, is the foundation of the Live Business—a business able to coordinate multiple functions in order to respond to and even anticipate customer demand at any moment.

Some industries and organizations are on the verge of discovering how business process digitization can help them go live. Others have already started putting it into action: fine-tuning operations to an unprecedented level across departments and at every point in the supply chain, cutting costs while turbocharging productivity, and spotting trends and making decisions at speeds that can only be called superhuman.

Balancing Insight and Action

Two kinds of algorithms drive process digitization, says Chandran Saravana, senior director of advanced analytics at SAP. Edge algorithms operate at the point where customers or other end users interact directly with a sensor, application, or Internet-enabled device. These algorithms, such as speech or image recognition, focus on simplicity and accuracy. They make decisions based primarily on their ability to interpret input with precision and then deliver a result in real time.

Edge algorithms work in tandem with, and sometimes mature into, server-level algorithms, which report on both the results of data analysis and the analytical process itself. For example, the complex systems that generate credit scores assess how creditworthy an individual is, but they also explain to both the lender and the credit applicant why a score is low or high, what factors went into calculating it, and what an applicant can do to raise the score in the future. These server-based algorithms gather data from edge algorithms, learn from their own results, and become more accurate through continuous feedback. The business can then track the results over time to understand how well the digitized process is performing and how to improve it.
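
As a concrete illustration of that server-level transparency, here is a toy credit-score sketch: it returns a score plus the factors behind it, so both lender and applicant can see what to improve. The weights and factor names are invented for illustration, not any real scoring model.

```python
# Toy sketch of a "server-level" algorithm: it reports not just the
# score but the factors behind it. Weights and factor names are
# invented for illustration.
def server_score(applicant):
    factors = {
        "on_time_payment_rate": 400 * applicant["on_time_payment_rate"],
        "credit_utilization": -200 * applicant["credit_utilization"],
        "history_years": 10 * min(applicant["history_years"], 20),
    }
    score = 300 + sum(factors.values())
    reasons = sorted(factors, key=factors.get)  # weakest contributors first
    return round(score), reasons

score, reasons = server_score(
    {"on_time_payment_rate": 0.92, "credit_utilization": 0.65, "history_years": 4}
)
print(score, "- factor to improve first:", reasons[0])
```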

From Data Scarcity to a Glut

To operate in real time, businesses need an accurate data model that compares what’s already known about a situation to what’s happened in similar situations in the past to reach a lightning-fast conclusion about what’s most likely to happen next. The greatest barrier to this level of responsiveness used to be a lack of data, but the exponential growth of data volumes in the last decade has flipped this problem on its head. Today, the big challenge for companies is having too much data and not enough time or power to process it, says Saravana.

Even the smartest human is incapable of gathering all the data about a given situation, never mind considering all the possible outcomes. Nor can a human mind reach conclusions at the speed necessary to drive Live Business. On the other hand, carefully crafted algorithms can process terabytes or even petabytes of data, analyze patterns and detect outliers, arrive at a decision in seconds or less—and even learn from their mistakes (see How to Train Your Algorithm).

How to Train Your Algorithm 

The data that feeds process digitization can’t just simmer. It needs constant stirring.

Successfully digitizing a business process requires you to build a model of the business process based on existing data. For example, a bank creates a customer record that includes not just the customer’s name, address, and date of birth but also the amount and date of the first deposit, the type of account, and so forth. Over time, as the customer develops a history with the bank and the bank introduces new products and services, customer records expand to include more data. Predictive analytics can then extrapolate from these records to reach conclusions about new customers, such as calculating the likelihood that someone who just opened a money market account with a large balance will apply for a mortgage in the next year.
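
A minimal sketch of that kind of extrapolation, assuming a logistic model whose coefficients (all invented here) were fitted on historical customer records:

```python
# Sketch of scoring a new customer's likelihood of applying for a
# mortgage from fields in their record. Coefficients are made-up
# stand-ins for what a fitted model would learn.
import math

COEFFS = {"intercept": -3.0, "opened_money_market": 1.2,
          "balance_over_50k": 1.5, "age_30_to_45": 0.8}

def mortgage_likelihood(customer):
    z = COEFFS["intercept"]
    z += COEFFS["opened_money_market"] * customer["opened_money_market"]
    z += COEFFS["balance_over_50k"] * customer["balance_over_50k"]
    z += COEFFS["age_30_to_45"] * customer["age_30_to_45"]
    return 1 / (1 + math.exp(-z))  # logistic link: probability in (0, 1)

p = mortgage_likelihood({"opened_money_market": 1, "balance_over_50k": 1, "age_30_to_45": 1})
print(f"{p:.0%} chance of a mortgage application within a year")
```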

To keep data models accurate, you have to have enough data to ensure that your models are complete—that is, that they account for every possible predictable outcome. The model also has to push outlying data and exceptions, which create unpredictable outcomes, to human beings who can address their special circumstances. For example, an algorithm may be able to determine that a delivery will fail to show up as scheduled and can point to the most likely reasons why, but it can only do that based on the data it can access. It may take a human to start the process of locating the misdirected shipment, expediting a replacement, and establishing what went wrong by using business knowledge not yet included in the data model.
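
The escalation rule itself can be as simple as a confidence threshold. A minimal sketch, with illustrative thresholds and labels:

```python
# Confidence-threshold routing: the model auto-handles predictable
# cases and hands everything else to a person. Threshold and labels
# are illustrative assumptions.
def route(prediction, confidence, threshold=0.85):
    if confidence >= threshold:
        return f"auto-handle: {prediction}"
    return "escalate to a human (special circumstances)"

print(route("delivery_on_time", 0.97))  # auto-handle: delivery_on_time
print(route("delivery_late", 0.42))     # escalate to a human ...
```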

Indeed, data models need to be monitored for relevance. Whenever the results of a predictive model start to drift significantly from expectations, it’s time to examine the model to determine whether you need to dump old data that no longer reflects your customer base, add a new product or subtract a defunct one, or include a new variable, such as marital status or length of customer relationship, that further refines your results.
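
In practice, that monitoring can start as a simple comparison of predicted rates against actuals over a rolling window. A sketch, with an arbitrary tolerance chosen for illustration:

```python
# Minimal drift check: compare a rolling window of actual outcomes to
# what the model predicted, and flag the model for review when the gap
# widens. The 0.05 tolerance is an arbitrary illustration.
def drift_alert(predicted_rates, actual_rates, tolerance=0.05):
    gaps = [abs(p - a) for p, a in zip(predicted_rates, actual_rates)]
    mean_gap = sum(gaps) / len(gaps)
    return mean_gap > tolerance, mean_gap

alert, gap = drift_alert([0.10, 0.12, 0.11], [0.18, 0.21, 0.19])
if alert:
    print(f"mean gap {gap:.2f}: re-examine the model's data and variables")
```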

It’s also important to remember that data doesn’t need to be perfect—and, in fact, probably shouldn’t be, no matter what you might have heard about the difficulty of starting predictive analytics with lower-quality data. To train an optical character recognition system to recognize and read handwriting in real time, for example, your data stores of block-printing and cursive-writing samples also have to include a few sloppy scrawls so the system can learn to decode them.

On the other hand, in a fast-changing marketplace, all the products and services in your database need consistent and unchanging references, even though outside the database, names, SKUs, and other identifiers for a single item may vary from one month or one order to the next. Without consistency, your business process model won’t be accurate, nor will the results.
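
The usual fix is a canonical-reference table that maps every alias a product acquires back to one stable internal ID before the data reaches the model. A sketch with a hypothetical alias table:

```python
# Canonicalization sketch: map the varying names and SKUs seen in
# incoming orders onto one stable internal ID. The alias table and
# IDs are hypothetical examples.
ALIASES = {
    "WGT-100": "PROD-001", "Widget 100": "PROD-001", "widget100": "PROD-001",
}

def canonical_id(raw_name):
    key = raw_name.strip()
    # Try the exact form first, then a normalized (lowercased, unspaced) form.
    return ALIASES.get(key) or ALIASES.get(key.replace(" ", "").lower())

assert canonical_id("Widget 100") == canonical_id("WGT-100") == "PROD-001"
```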

Finally, when you’re using algorithms to generate recommendations to drive your business process, the process needs to include opportunities to test new messages and products against existing successful ones as well as against random offerings, Saravana says. Otherwise, instead of responding to your customers’ needs, your automated system will actually control their choices by presenting them with only a limited group of options drawn from those that have already received the most positive results.
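
That test-against-random discipline is essentially what bandit-style experimentation does. A minimal epsilon-greedy sketch: mostly show the best-performing offer, but reserve a slice of traffic for random exploration so new options can prove themselves.

```python
# Epsilon-greedy offer selection: exploit the observed winner most of
# the time, explore at random the rest. Offer names and rates are
# invented for illustration.
import random

def pick_offer(conversion_rates, epsilon=0.1):
    """conversion_rates: observed rate per offer, e.g. {'A': 0.04, ...}"""
    if random.random() < epsilon:                            # explore
        return random.choice(list(conversion_rates))
    return max(conversion_rates, key=conversion_rates.get)   # exploit

rates = {"existing_offer": 0.040, "new_offer": 0.012, "random_candidate": 0.0}
print(pick_offer(rates))  # usually "existing_offer", occasionally another
```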

Any process is only as good as it’s been designed to be. Digitizing business processes doesn’t eliminate the possibility of mistakes and problems, but it does ensure that the mistakes and problems that arise are easy to spot and fix.

From Waste to Gold

Organizations moving to digitize and streamline core processes are even discovering new business opportunities and building new digitized models around them. That’s what happened at Hopper, an airfare prediction app firm in Cambridge, Massachusetts, which discovered in 2013 that it could mine its archives of billions of itineraries to spot historical trends in airfare pricing—data that was previously considered “waste product,” according to Hopper’s chief data scientist, Patrick Surry.

Hopper developed AI algorithms to correlate those past trends with current fares and to predict whether and when the price of any given flight was likely to rise or fall. The results were so accurate that Hopper jettisoned its previous business model. “We check up to 3 billion itineraries live, in real time, each day, then compare them to the last three to four years of historical airfare data,” Surry says. “When consumers ask our smartphone app whether they should buy now or wait, we can tell them, ‘yes, that’s a good deal, buy it now,’ or ‘no, we think that fare is too expensive, we predict it will drop, and we’ll alert you when it does.’ And we can give them that answer in less than one second.”
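
Hopper hasn’t published its models, but the buy-or-wait decision can be sketched as a comparison of today’s quote against the distribution of historical fares for the same route. The percentile cutoff below is invented for illustration:

```python
# Hedged sketch of a buy-or-wait recommendation: see where today's
# quote sits within the historical fare distribution for this route.
def buy_or_wait(current_fare, historical_fares):
    pricier = sum(f >= current_fare for f in historical_fares)
    percentile = pricier / len(historical_fares)  # share of history that cost more
    if percentile >= 0.8:
        return "good deal - buy it now"
    return "too expensive - we predict it will drop; wait for an alert"

history = [450, 480, 510, 520, 495, 530, 470, 505]
print(buy_or_wait(460, history))  # most of history was pricier: buy
```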

While trying to predict airfare trends is nothing new, Hopper has told TechCrunch that it can not only save users up to 40% on airfares but also find them the lowest possible price 95% of the time. Surry says that’s all due to Hopper’s algorithms and data models.

The Hopper app launched on iOS in January 2015 and on Android eight months later. The company also switched in September 2015 from directing customers to external travel agencies to taking bookings directly through the app for a small fee. The Hopper app has already been downloaded to more than 2 million phones worldwide.

Surry predicts that we’ll soon see sophisticated chatbots that can start with vague requests from customers like “I want to go somewhere warm in February for less than $500,” proceed to ask questions that help users narrow their options, and finally book a trip that meets all their desired parameters. Eventually, he says, these chatbots will be able to handle millions of interactions simultaneously, allowing a wide variety of companies to reassign human call center agents to the handling of high-value transactions and exceptions to the rules built into the digitized booking process.

Port of Hamburg Lets the Machines Untangle Complexity

In early 2015, AI experts told Wired magazine that at least another 10 years would pass before a computer could best the top human players at Go, an ancient game that’s exponentially harder than chess. Yet before the end of that same year, Wired also reported that machine learning techniques drove Google’s AlphaGo AI to win four games out of five against one of the world’s top Go players. This feat proves just how good algorithms have become at managing extremely complex situations with multiple interdependent choices, Saravana points out.

The Port of Hamburg, which has digitized traffic management for an estimated 40,000 trucks a day, is a good example. In the past, truck drivers had to show up at the port to check traffic and parking message boards. If they arrived before their ships docked, they had to drive around or park in the neighboring residential area, contributing to congestion and air pollution while they waited to load or unload. Today, the HPA’s smartPORT mobile app tracks individual trucks using telematics. It customizes the information that drivers receive based on location and optimizes truck routes and parking in real time so drivers can make more stops a day with less wasted time and fuel.

The platform that drives the smartPORT app also uses sensor data in other ways: it tracks wind speed and direction and transmits the data to ship pilots so they can navigate in and out of the port more safely. It monitors emissions and their impact on air quality in various locations in order to adjust operations in real time for better control over environmental impact. It automatically activates streetlights for vehicle and pedestrian traffic, then switches them off again to save energy when the road is empty. This ability to coordinate and optimize multiple business functions on the fly makes the Port of Hamburg a textbook example of a Live Business.

Digitization Is Not Bounded by Industry

Retail and B2B businesses of all types will inevitably join the Port of Hamburg in further digitizing processes, both in predictable ways and in ways we can only begin to imagine.

Customer service, for example, is likely to be in the vanguard. Automated systems already feed information about customers to online and phone-based service representatives in real time, generate cross-selling and upselling opportunities based on past transactions, and answer customers’ frequently asked questions. Saravana foresees these systems becoming even more sophisticated, powered by AI algorithms that are virtually indistinguishable from human customer service agents in their ability to handle complex live interactions in real time.

In manufacturing and IT, Sven Bauszus, global vice president and general manager for predictive analytics at SAP, forecasts that sensors and predictive analysis will further automate the process of scheduling and performing maintenance, such as monitoring equipment for signs of failure in real time, predicting when parts or entire machines will need replacement, and even ordering replacements preemptively. Similarly, combining AI, sensors, data mining, and other technologies will enable factories to optimize workforce assignments in real time based on past trends, current orders, and changing market conditions.
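
The core of that maintenance scenario is extrapolating a sensor trend toward a failure threshold. A hedged sketch, with made-up vibration readings and limits:

```python
# Illustrative predictive-maintenance sketch: watch a daily sensor
# reading drift toward its failure threshold and order the replacement
# part ahead of the projected breach. All numbers are hypothetical.
def days_until_threshold(readings, threshold):
    """Linear extrapolation of a daily sensor reading toward its limit."""
    daily_change = (readings[-1] - readings[0]) / (len(readings) - 1)
    if daily_change <= 0:
        return None  # not trending toward failure
    return (threshold - readings[-1]) / daily_change

vibration = [2.1, 2.3, 2.6, 2.8, 3.1]   # mm/s, rising day over day
days = days_until_threshold(vibration, threshold=4.5)
if days is not None and days < 14:
    print(f"projected breach in {days:.0f} days - order replacement now")
```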

Public health will be able to go live with technology that spots outbreaks of infectious disease, determines where medical professionals and support personnel are needed most and how many to send, and helps ensure that they arrive quickly with the right medication and equipment to treat patients and eradicate the root cause. It will also make it easier to track communicable illnesses, find people who are symptomatic, and recommend approaches to controlling the spread of the illness, Bauszus says.

He also predicts that the insurance industry, which has already begun to digitize its claims-handling processes, will refine its ability to sort through more claims in less time with greater accuracy and higher customer satisfaction. Algorithms will be better and faster at flagging claims that have a high probability of being fraudulent and then pushing them to claims inspectors for investigation. Simultaneously, the same technology will be able to identify and resolve valid claims in real time, possibly even cutting a check or depositing money directly into the insured person’s bank account within minutes.
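
The claims flow Bauszus describes reduces to a score-and-route step. A minimal sketch, with a toy scoring rule standing in for a real fraud model:

```python
# Claims triage sketch: score each claim for fraud probability, push
# risky ones to an inspector, pay clean ones immediately. The toy
# model and 0.7 cutoff are illustrative assumptions.
def triage_claim(claim, fraud_model, cutoff=0.7):
    p_fraud = fraud_model(claim)
    if p_fraud >= cutoff:
        return "route to claims inspector for investigation"
    return f"valid: pay {claim['amount']:.2f} to insured account now"

def toy_model(claim):
    # Stand-in for a trained model: flags a keyword in the claim notes.
    return 0.9 if "staged" in claim.get("notes", "") else 0.1

print(triage_claim({"amount": 1200.0, "notes": "rear-end collision"}, toy_model))
```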

Financial services firms will be able to apply machine learning, data mining, and AI to accelerate the process of rating borrowers’ credit and detecting fraud. Instead of filling out a detailed application, consumers might be able to get on-the-spot approval for a credit card or loan after inputting only enough information to be identified. Similarly, banks will be able to alert customers to suspicious transactions by text message or phone call—not within a day or an hour, as is common now, but in a minute or less.

Pitfalls and Possibilities

As intelligent as business processes can be programmed to be, there will always be a point beyond which they have to be supervised. Indeed, Saravana forecasts increasing regulation around when business processes can and can’t be digitized. Especially in areas involving data security, physical security, and health and safety, it’s one thing to allow machines to parse data and arrive at decisions to drive a critical business process, but it’s another thing entirely to allow them to act on those decisions without human oversight.

Automated, impersonal decision making is fine for supply chain automation, demand forecasting, inventory management, and other processes that need faster-than-human response times. In human-facing interactions, though, Saravana insists that it’s still best to digitize the part of the process that generates decisions, but leave it to a human to finalize the decision and decide how to put it into action.

“Any time the interaction is machine-to-machine, you don’t need a human to slow the process down,” he says. “But when the interaction involves a person, it’s much more tricky, because people have preferences, tastes, the ability to try something different, the ability to get fatigued—people are only statistically predictable.”

For example, technology has made it entirely possible to build a corporate security system that can gather information from cameras, sensors, voice recognition technology, and other IP-enabled devices. The system can then feed that information in a steady stream to an algorithm designed to identify potentially suspicious activity and act in real time to prevent or stop it while alerting the authorities. But what happens when an executive stays in the office unusually late to work on a presentation and the security system misidentifies her as an unauthorized intruder? What if the algorithm decides to lock the emergency exits, shut down the executive’s network access, or disable her with a Taser instead of simply sending an alert to the head of security asking what to do while waiting for the police to come?

The Risk Is Doing Nothing

The greater, if less dramatic, risk associated with digitizing business processes is simply failing to pursue it. It’s true that taking advantage of new digital technologies can be costly in the short term. There’s no question that companies have to invest in hardware, software, and qualified staff in order to prepare enormous data volumes for storage and analysis. They also have to implement new data sources such as sensors or Internet-connected devices, develop data models, and create and test algorithms to drive business processes that are currently analog. But as with any new technology, Saravana advises, it’s better to start small with a key use case, rack up a quick win with high ROI, and expand gradually than to drag your heels out of a failure to grasp the long-term potential.

The economy is digitizing rapidly, but not evenly. According to the McKinsey Global Institute’s December 2015 Digital America report, “The race to keep up with technology and put it to the most effective business use is producing digital ‘haves’ and ‘have-mores’—and the large, persistent gap between them is becoming a decisive factor in competition across the economy.” Companies that want to be among the have-mores need to commit to Live Business today. Failing to explore it now will put them on the wrong side of the gap and, in the long run, rack up a high price tag in unrealized efficiencies and missed opportunities.

About Erik Marcade

Erik Marcade is vice president of Advanced Analytics Products at SAP.

Strengthening Government Through Data Analytics

Dante Ricci

When it comes to analyzing data, there is a culture clash rooted in a disconnect within the government workforce, partly because many organizations don’t have people with the right technical skill sets in place. Yet government can uncover hidden insights to drive better results and create more value for citizens.

The need has never been greater to empower knowledge workers with a comprehensive – yet simple – integrated platform that helps unlock the real value in data for smarter decision-making.

Governments move toward constituent-centered platforms

The fact is, leading government organizations have begun to transform by using consumer-grade solutions to garner better insights from data. The key lies in self-service and automated analytics that do not require technical skill sets. Such solutions enable government personnel at all levels to shift from asking IT for historical reports to a real-time, predictive view that considers multiple data points and delivers a personalized picture.

Equipped with the right technology and a collaborative mindset, governments can uncover new insights to make life better, safer, and healthier when:

  1. Technology is intuitive and easy to use.
  2. Personnel can make decisions based on a combination of historical and real-time data rather than decisions based on historical perspective alone.
  3. Collaborative technology can include constituent insight and ideas for better decision making.

Digital transformation removes the massive barrier between agencies and departments by using a platform that shares data and eliminates the friction that slows down the entire process. The result is that agencies are able to do more, produce better results, and still save money. Digital by default is the key. The rewards are significant for those who successfully leverage analytics: a stronger competitive advantage, faster innovation, and improved lives.

Predictive solutions that appear before your eyes

Digitalized governments run friction-free, with decisions based on real contextual insights. Analytics helps leaders see problems before or as they occur. That real-time connection identifies potential problems and gives management time to correct them. As real-time data becomes available through input from sensors, transactions, constituents, and other information channels, decisions can be made at the moment of opportunity.

Putting it together

What happens when you need to make decisions, but your data is two years old? What if you need to rewrite a policy that focuses on performance and cost — but you have no information about costs?

Those sorts of problems occur every day. In the first scenario, your decision may be wrong because the data has changed. In the second, the policy update may be late. Both outcomes reflect negatively on performance and can harm the safety and quality of citizens’ lives. Both are also examples of the friction that occurs within governments – and of why relevant, timely data is necessary.

The power and tools that a digital government wields are transformative. The rewards for government are many: lower costs, improved services, safer communities, and a better overall quality of life. Services become seamless. Systems become fluid. Operational costs drop and better outcomes occur.

In short, you make better decisions when they are based on facts and context, not feelings. People who need help get help quickly. Operational issues become identified and fixed. People are happy. And isn’t that the way government should work?

Are you ready for change?

Read more about SAP’s perspective on digital government here.

About Dante Ricci

Dante Ricci is the Global Public Services Marketing & Communications lead at SAP. His specialties include enterprise software, business strategy, business development, cloud computing and solution selling.