
The Top 10 Trends In Analytics 2013

Timo Elliott


I’ve been passionate about analytics for over twenty years – but my head is still spinning with the amount of change currently going on in the analytics industry. Here’s my quick personal view of the top 10 trends in Analytics and Business Intelligence for 2013 — what did I miss?

1. Analytics And Business Intelligence Are Still #1

According to Gartner’s latest CIO survey, the top business priority is back to enterprise growth, and analytics and business intelligence remains the number one technology priority for 2013. And the next three technologies on the priority list (mobile, cloud, and collaboration) are all key areas for analytic innovation.

2. Increasing Analytic Maturity

Thanks to greater industry maturity and new technology opportunities, most organizations are moving from Descriptive Analytics (“what happened?”) and Diagnostic Analytics (“why did it happen?”) towards Predictive Analytics (“what will happen?”) – with Prescriptive Analytics (“how can we make it happen?”) as the next frontier.

3. In-Memory is Ripping Up The Old Rules

In-memory computing is providing an opportunity to rethink information systems from scratch. According to Gartner, in-memory: “isn’t only about SAP HANA, isn’t new, isn’t unproven, isn’t only about big companies, and isn’t only about analytics”:

“In-memory computing will have a long-term disruptive impact by radically changing users’ expectations, application design principles, and vendors’ strategies.”

4. Breaking Down Old Barriers

In-memory breaks down long-standing analytics barriers. For example, the in-memory computing platform SAP HANA supports structured and unstructured data in a single system, and includes a sophisticated, embedded text analysis engine. Predictive or advanced analytics no longer requires a separate system – powerful analytic algorithms are available directly in-memory, without any unnecessary data movement, and thousands of times faster than disk-based predictive systems.

5. Operations and Analytics Are No Longer Separate

For forty years, operational systems and analytic systems have been separate because of technology limitations. That’s now changing with in-memory platforms. For example, with SAP Business Suite on HANA, transactional data is written directly to memory, where it is instantly available without any of the analytic compromises that have plagued earlier “real-time” analytics.

6. Big Data is a Big Deal

In addition to traditional “transaction data”, it’s now feasible to analyze “interaction data” (events before, after, and around a transaction, such as the products that were considered but then not purchased) and “observation data” (such as data streamed from sensors). Programming models such as MapReduce and projects such as Hadoop have introduced new opportunities for storing and analyzing data that was previously ignored because of technology limitations. Actuaries are finding new careers and glory as “data scientists”. These new technologies have more than proved their worth in niche or standalone systems, but they need to be better integrated with existing corporate environments.
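To make the MapReduce pattern concrete, here is a minimal, framework-free sketch in Python (the event records and products are invented for illustration; a real Hadoop job runs the same two phases distributed across a cluster): a map step emits key/value pairs from raw interaction records, and a reduce step aggregates them per key.

```python
from collections import defaultdict

# Illustrative "interaction data": invented raw view/purchase events.
events = [
    {"session": 1, "product": "A", "action": "view"},
    {"session": 1, "product": "A", "action": "purchase"},
    {"session": 2, "product": "B", "action": "view"},
    {"session": 3, "product": "B", "action": "view"},
]

def map_phase(event):
    # Map: emit one (product, 1) pair for every view event.
    if event["action"] == "view":
        yield (event["product"], 1)

def reduce_phase(pairs):
    # Reduce: sum the emitted counts per product key.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

pairs = [pair for event in events for pair in map_phase(event)]
print(reduce_phase(pairs))  # {'A': 1, 'B': 2}
```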

7. Analytics Moves To The Core

Analytics is no longer an afterthought to your transaction systems — it’s the heart of your future information infrastructure. The data you are storing now will still be with you in 15 or 20 years’ time, while your applications may be long gone. The next generation of information infrastructures will combine big data, transactional data, analytic data and “content” into a single, coherent set of services that Gartner calls an “information capabilities framework”:

“The information capabilities framework is the people-, process- and technology-agnostic set of capabilities needed to describe, organize, integrate, share and govern an organization’s information assets in an application-independent manner in support of its enterprise information management (EIM) goals.”

SAP is working on this vision with the “real time data platform”, combining SAP HANA with Hadoop, Sybase ASE, Sybase IQ, Sybase ESP – and (crucially) end-to-end information governance.

8. Optimizing the User Experience

Today’s information consumers demand the same ease of use and immediate access they get in the consumer world. Business people want to be able to grab and mix information on the fly, without having to wait for it to be loaded into a corporate data warehouse. Data discovery tools such as SAP Visual Intelligence meet this essential demand – without sacrificing corporate needs for enterprise governance. And of course, people expect a smooth, mobile-ready BI experience with integrated social collaboration, and the option of using a cloud-based infrastructure.

9. Information as an Asset

Along with all the technology changes, there have been big changes to analytics culture. Information is no longer a byproduct of manufacturing processes – it is fast becoming a key part of the products themselves. Today’s retailers and service providers want to offer “customer experiences” that are tailored to individuals, optimized for the moment, and coherent over time – and that requires powerful new data platforms. As information becomes part of revenue generation, interest in information and control over budgets are swiftly moving to the business units rather than traditional IT. This is creating new opportunities, but also new IT pressures and organizational issues.

10. The Revenge of Information Governance

As the technology gets more and more powerful, it becomes even more important to fix one of the oldest and least tractable barriers to successful BI: the pain of integrating multiple data sets while maintaining data quality. Better integration between “big data,” traditional analytic systems, and transaction systems must also involve investments in data governance and solutions such as SAP Information Steward.

What did I miss? Add a comment below…

The Next Round of the Analytics Revolution

If you’d like to find out more about any of these trends, don’t hesitate to contact me, and I’ll help point you to the best experts available. If you’re interested in SAP Analytics technology, you should follow the Business Intelligence areas of the SAP Community Network, subscribe to the SAP Analytics Blog, follow @sapanalytics or @timoelliott on Twitter, and join us at the analytics campus of SAPPHIRE NOW and ASUG 2013 in Orlando, May 14-16, to explore industry changes in depth, hear about companies that are implementing analytics in new ways, and talk face-to-face with the experts.

[Note that a version of this post originally appeared on the SAPPHIRE NOW area of the SAP Community Network]


About Timo Elliott

Timo Elliott is an innovation evangelist and international conference speaker who has presented to business and IT audiences in over forty countries around the world. A 23-year veteran of SAP BusinessObjects, Elliott works closely with SAP development and innovation centers around the world on new technology directions. His popular Business Analytics blog at timoelliott.com tracks innovation in analytics and social media, including topics such as big data, collaborative decision-making, and social analytics. Prior to Business Objects, Elliott was a computer consultant in Hong Kong and led analytics projects for Shell in New Zealand. He holds a first-class honors degree in Economics with Statistics from Bristol University, England.


Data Analysts And Scientists More Important Than Ever For The Enterprise

Daniel Newman

The business world is now firmly in the age of data. Not that data wasn’t relevant before; it was just nowhere close to the speed and volume that’s available to us today. Businesses are buckling under the deluge of petabytes, exabytes, and zettabytes. Within these bytes lies valuable information on customer behavior, key business insights, and revenue generation. However, all that data is practically useless for businesses without the ability to identify the right data. Plus, if they don’t have the talent and resources to capture the right data, organize it, dissect it, draw actionable insights from it and, finally, deliver those insights in a meaningful way, their data initiatives will fail.

Rise of the CDO

Companies of all sizes can easily find themselves drowning in data generated from websites, landing pages, social streams, emails, text messages, and many other sources. Additionally, there is data in their own repositories. With so much data at their disposal, companies are under mounting pressure to utilize it to generate insights. These insights are critical because they can (and should) drive the overall business strategy and help companies make better business decisions. To leverage the power of data analytics, businesses need more “top-management muscle” specialized in the field of data science. This specialized field has led to the creation of roles like Chief Data Officer (CDO).

In addition, with more companies undertaking digital transformations, there’s greater impetus for the C-suite to make data-driven decisions. The CDO helps make data-driven decisions and also develops a digital business strategy around those decisions. As data grows at an unstoppable rate, becoming an inseparable part of key business functions, we will see the CDO act as a bridge between other C-suite execs.

Data skills an emerging business necessity

So far, only large enterprises with the biggest data mining and management needs maintain in-house solutions. These in-house teams and technologies handle growing sets of diverse and dispersed data. Other companies work with third-party service providers to develop and execute their big data strategies.

As the amount of data grows, the need to mine it for insights becomes a key business requirement. For both large and small businesses, data-centric roles such as data analysts and scientists will see strong upward mobility. There is going to be a huge opportunity for critical thinkers to turn their analytical skills into rapidly growing roles in the field of data science. In fact, data skills are now a prized qualification for titles like IT project manager and computer systems analyst.

Forbes cited the McKinsey Global Institute’s prediction that by 2018 there could be a massive shortage of data-skilled professionals. This points to a demand-supply gap, with the need for data skills at an all-time high. With an increasing number of companies adopting big data strategies, salaries for data jobs are going through the roof, turning these positions into highly coveted ones.

According to Harvard Professor Gary King, “There is a big data revolution. The big data revolution is that now we can do something with the data.” The big problem is that most enterprises don’t know what to do with data. Data professionals are helping businesses figure that out. So if you’re casting about for where to apply your skills and want to take advantage of one of the best career paths in the job market today, focus on data science.

I’m compensated by University of Phoenix for this blog. As always, all thoughts and opinions are my own.

For more insight on our increasingly connected future, see The $19 Trillion Question: Are You Undervaluing The Internet Of Things?



About Daniel Newman

Daniel Newman serves as the co-founder and CEO of EC3, a quickly growing hosted IT and communication service provider. Prior to this role, Daniel held several prominent leadership positions, including serving as CEO of United Visual, parent company to United Visual Systems, United Visual Productions, and United GlobalComm, a family of companies focused on visual communications and audiovisual technologies. Daniel is also widely published and active in the social media community. He is the author of the Amazon best-selling business book "The Millennial CEO." Daniel also co-founded the global online community 12 Most and was recognized by the Huffington Post as one of the 100 Business and Leadership Accounts to Follow on Twitter. Newman is an adjunct professor of management at North Central College. He attained his undergraduate degree in marketing at Northern Illinois University and an executive MBA from North Central College in Naperville, IL. Newman currently resides in Aurora, Illinois with his wife (Lisa) and his two daughters (Hailey, 9, and Avery, 5). A Chicago native all his life, Newman is an avid golfer, a fitness fan, and a classically trained pianist.

When Good Is Good Enough: Guiding Business Users On BI Practices

Ina Felsheim

In Part One of this blog series, I talked about changing your IT culture to better support self-service BI and data discovery. Absolutely essential. However, your work is not done!

Self-service BI and data discovery will rapidly expand the number of people using your BI solutions. Yet these more casual users will not be well versed in BI and visualization best practices.

When your user base rapidly expands to more casual users, you need to help educate them on what is important. For example, one IT manager told me that his casual BI users were making visualizations with very difficult-to-read charts and customizing color palettes to incredible degrees.

I had a similar experience when I was a technical writer. One of our lead writers was so concerned with the readability of every sentence that he went through the 300+ page manuals (yes, they were printed then) and manually adjusted all of the line breaks and page breaks. (!) Yes, readability was incrementally improved. But now any number of changes (technical capabilities, edits, inserting larger graphics) required re-adjusting all of those manual “optimizations.” The time it took just to do the additional optimization was incredible, to say nothing of maintaining it! Meanwhile, the technical writing team was falling behind on new deliverables.

The same scenario applies to your new casual BI users. This new group needs guidance to help them focus on the highest value practices:

  • Customization of color and appearance of visualizations: When is this customization necessary for a management deliverable, versus indulging an OCD tendency? I too have to stop myself from obsessing about the font, line spacing, and that a certain blue is just a bit different than another shade of blue. Yes, these options do matter. But help these casual users determine when that time is well spent.
  • Proper visualizations: When is a spinning 3D pie chart necessary to grab someone’s attention? BI professionals would firmly say “NEVER!” But these casual users do not have a lot of depth on BI best practices. Give them a few simple guidelines as to when “flash” is appropriate and when it gets in the way of understanding. Consider offering a monthly one-hour Lunch and Learn that shows them how to create impactful, polished visuals. Understanding whether their visualizations are going to be viewed casually on the way to a meeting, or dissected at a laptop, also helps determine how much time to spend optimizing a visualization. No, you can’t just mandate that they all read Tufte.
  • Predictive: Provide advanced analytics capabilities like forecasting and regression directly in their casual BI tools. Using these capabilities will really help them wow their audience with substance instead of flash.
  • Feature requests: Make sure you understand the motivation and business value behind some of the casual users’ requests. These casual users are less likely to understand the implications of supporting specific requests across an enterprise, so make sure you are collaborating on use cases and priorities for substantive requests.

By working with your casual BI users on the above points, you will be able to collectively understand when the absolute exact request is critical (and supports good visualization practices), and when it is an “optimization” that may impact productivity. In many cases, “good” is good enough for the fast turnaround of data discovery.

Next week, I’ll wrap this series up with hints on getting your casual users to embrace the “we” not “me” mentality.

Read Part One of this series: Changing The IT Culture For Self-Service BI Success.

Follow me on Twitter: @InaSAP


Unlock Your Digital Super Powers: How Digitization Helps Companies Be Live Businesses

Erik Marcade and Fawn Fitter

The Port of Hamburg handles 9 million cargo containers a year, making it one of the world’s busiest container ports. According to the Hamburg Port Authority (HPA), that volume doubled in the last decade, and it’s expected to at least double again in the next decade—but there’s no room to build new roads in the center of Hamburg, one of Germany’s historic cities. The port needed a way to move more freight more efficiently with the physical infrastructure it already has.

The answer, according to an article on ZDNet, was to digitize the processes of managing traffic into, within, and back out of the port. By deploying a combination of sensors, telematics systems, smart algorithms, and cloud data processing, the Port of Hamburg now collects and analyzes a vast amount of data about ship arrivals and delays, parking availability, ground traffic, active roadwork, and more. It generates a continuously updated model of current port conditions, then pushes the results through mobile apps to truck drivers, letting them know exactly when ships are ready to drop off or receive containers and optimizing their routes. According to the HPA, they are now on track to handle 25 million cargo containers a year by 2025 without further congestion or construction, helping shipping companies bring more goods and raw materials in less time to businesses and consumers all across Europe.

In the past, the port could only have solved its problem with backhoes and building permits—which, given the physical constraints, means the problem would have been unsolvable. Today, though, software and sensors are allowing it to improve processes and operations to a previously impossible extent. Big Data analysis, data mining, machine learning, artificial intelligence (AI), and other technologies have finally become sophisticated enough to identify patterns not just in terabytes but in petabytes of data, make decisions accordingly, and learn from the results, all in seconds. These technologies make it possible to digitize all kinds of business processes, helping organizations become more responsive to changing market conditions and more able to customize interactions to individual customer needs. Digitization also streamlines and automates these processes, freeing employees to focus on tasks that require a human touch, like developing innovative strategies or navigating office politics.

In short, digitizing business processes is key to ensuring that the business can deliver relevant, personalized responses to the market in real time. And that, in turn, is the foundation of the Live Business—a business able to coordinate multiple functions in order to respond to and even anticipate customer demand at any moment.

Some industries and organizations are on the verge of discovering how business process digitization can help them go live. Others have already started putting it into action: fine-tuning operations to an unprecedented level across departments and at every point in the supply chain, cutting costs while turbocharging productivity, and spotting trends and making decisions at speeds that can only be called superhuman.

Balancing Insight and Action

Two kinds of algorithms drive process digitization, says Chandran Saravana, senior director of advanced analytics at SAP. Edge algorithms operate at the point where customers or other end users interact directly with a sensor, application, or Internet-enabled device. These algorithms, such as speech or image recognition, focus on simplicity and accuracy. They make decisions based primarily on their ability to interpret input with precision and then deliver a result in real time.

Edge algorithms work in tandem with, and sometimes mature into, server-level algorithms, which report on both the results of data analysis and the analytical process itself. For example, the complex systems that generate credit scores assess how creditworthy an individual is, but they also explain to both the lender and the credit applicant why a score is low or high, what factors went into calculating it, and what an applicant can do to raise the score in the future. These server-based algorithms gather data from edge algorithms, learn from their own results, and become more accurate through continuous feedback. The business can then track the results over time to understand how well the digitized process is performing and how to improve it.
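As a toy illustration of that idea (the factors, weights, and base score below are invented; no real bureau scores this way), the key property of a server-level scorer is that it returns the contributing factors alongside the score:

```python
# Invented scorecard weights -- for illustration only.
WEIGHTS = {"on_time_payment_rate": 300, "utilization": -150, "years_of_history": 10}
BASE = 500

def score(applicant):
    # Each factor's contribution is weight * value.
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    total = BASE + sum(contributions.values())
    # "Reason codes": the two factors that helped the score the least.
    reasons = sorted(contributions, key=contributions.get)[:2]
    return round(total), reasons

applicant = {"on_time_payment_rate": 0.9, "utilization": 0.6, "years_of_history": 4}
print(score(applicant))  # (720, ['utilization', 'years_of_history'])
```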

From Data Scarcity to a Glut

To operate in real time, businesses need an accurate data model that compares what’s already known about a situation to what’s happened in similar situations in the past to reach a lightning-fast conclusion about what’s most likely to happen next. The greatest barrier to this level of responsiveness used to be a lack of data, but the exponential growth of data volumes in the last decade has flipped this problem on its head. Today, the big challenge for companies is having too much data and not enough time or power to process it, says Saravana.

Even the smartest human is incapable of gathering all the data about a given situation, never mind considering all the possible outcomes. Nor can a human mind reach conclusions at the speed necessary to drive Live Business. On the other hand, carefully crafted algorithms can process terabytes or even petabytes of data, analyze patterns and detect outliers, arrive at a decision in seconds or less—and even learn from their mistakes (see How to Train Your Algorithm).

How to Train Your Algorithm 

The data that feeds process digitization can’t just simmer. It needs constant stirring.

Successfully digitizing a business process requires you to build a model of the business process based on existing data. For example, a bank creates a customer record that includes not just the customer’s name, address, and date of birth but also the amount and date of the first deposit, the type of account, and so forth. Over time, as the customer develops a history with the bank and the bank introduces new products and services, customer records expand to include more data. Predictive analytics can then extrapolate from these records to reach conclusions about new customers, such as calculating the likelihood that someone who just opened a money market account with a large balance will apply for a mortgage in the next year.
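A minimal sketch of that extrapolation step, assuming scikit-learn and a handful of invented customer records (deposit in thousands, a money-market flag, and tenure in years):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training records: [deposit_thousands, has_money_market, tenure_years]
X = np.array([
    [50, 1, 2],
    [1,  0, 1],
    [80, 1, 5],
    [3,  0, 8],
])
y = np.array([1, 0, 1, 0])  # 1 = applied for a mortgage within a year

model = LogisticRegression().fit(X, y)

# A new customer who just opened a large money market account.
new_customer = np.array([[60, 1, 0]])
print(model.predict_proba(new_customer)[0, 1])  # estimated mortgage likelihood
```

A real model would be trained on far more records and features, but the shape of the process is the same: fit on history, then score new customers.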

To keep data models accurate, you have to have enough data to ensure that your models are complete—that is, that they account for every possible predictable outcome. The model also has to push outlying data and exceptions, which create unpredictable outcomes, to human beings who can address their special circumstances. For example, an algorithm may be able to determine that a delivery will fail to show up as scheduled and can point to the most likely reasons why, but it can only do that based on the data it can access. It may take a human to start the process of locating the misdirected shipment, expediting a replacement, and establishing what went wrong by using business knowledge not yet included in the data model.

Indeed, data models need to be monitored for relevance. Whenever the results of a predictive model start to drift significantly from expectations, it’s time to examine the model to determine whether you need to dump old data that no longer reflects your customer base, add a new product or subtract a defunct one, or include a new variable, such as marital status or length of customer relationship, that further refines your results.
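One simple way to monitor for that drift, sketched here with an invented threshold: compare the model's recent error rate against the error rate it had at deployment, and flag it for review when the gap grows too large.

```python
# A naive drift check -- an illustrative sketch, not a production monitor.
def needs_review(baseline_error, recent_outcomes, threshold=0.05):
    # recent_outcomes: 1 = wrong prediction, 0 = correct prediction.
    recent_error = sum(recent_outcomes) / len(recent_outcomes)
    return (recent_error - baseline_error) > threshold

recent = [0, 1, 0, 0, 1, 1, 0, 1, 0, 1]  # 50% error in the latest window
print(needs_review(baseline_error=0.20, recent_outcomes=recent))  # True
```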

It’s also important to remember that data doesn’t need to be perfect—and, in fact, probably shouldn’t be, no matter what you might have heard about the difficulty of starting predictive analytics with lower-quality data. To train an optical character recognition system to recognize and read handwriting in real time, for example, your data stores of block printing and cursive writing samples also have to include a few sloppy scrawls so the system can learn to decode them.

On the other hand, in a fast-changing marketplace, all the products and services in your database need consistent and unchanging references, even though outside the database, names, SKUs, and other identifiers for a single item may vary from one month or one order to the next. Without consistency, your business process model won’t be accurate, nor will the results.
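In practice, that consistency is often enforced with a canonical-reference lookup before data enters the model. A small sketch with invented SKUs and product names:

```python
# Map every variant identifier seen in the wild onto one stable
# internal reference; unmapped values are surfaced for a human steward.
CANONICAL = {
    "WIDGET-2000": "W2000",
    "Widget 2000 (new pkg)": "W2000",
    "SKU-7781": "W2000",
}

def canonical_id(raw):
    key = raw.strip()
    return CANONICAL.get(key, f"UNMAPPED:{key}")

for raw in ["WIDGET-2000 ", "SKU-7781", "Widget 2000"]:
    print(canonical_id(raw))  # W2000, W2000, UNMAPPED:Widget 2000
```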

Finally, when you’re using algorithms to generate recommendations to drive your business process, the process needs to include opportunities to test new messages and products against existing successful ones as well as against random offerings, Saravana says. Otherwise, instead of responding to your customers’ needs, your automated system will actually control their choices by presenting them with only a limited group of options drawn from those that have already received the most positive results.

Any process is only as good as it’s been designed to be. Digitizing business processes doesn’t eliminate the possibility of mistakes and problems; but it does ensure that the mistakes and problems that arise are easy to spot and fix.

From Waste to Gold

Organizations moving to digitize and streamline core processes are even discovering new business opportunities and building new digitized models around them. That’s what happened at Hopper, an airfare prediction app firm in Cambridge, Massachusetts, which discovered in 2013 that it could mine its archives of billions of itineraries to spot historical trends in airfare pricing—data that was previously considered “waste product,” according to Hopper’s chief data scientist, Patrick Surry.

Hopper developed AI algorithms to correlate those past trends with current fares and to predict whether and when the price of any given flight was likely to rise or fall. The results were so accurate that Hopper jettisoned its previous business model. “We check up to 3 billion itineraries live, in real time, each day, then compare them to the last three to four years of historical airfare data,” Surry says. “When consumers ask our smartphone app whether they should buy now or wait, we can tell them, ‘yes, that’s a good deal, buy it now,’ or ‘no, we think that fare is too expensive, we predict it will drop, and we’ll alert you when it does.’ And we can give them that answer in less than one second.”
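The underlying decision can be pictured as a percentile rule (an illustrative simplification, not Hopper's actual model): if today's fare sits low in the historical distribution for that route and date range, advise buying; otherwise advise waiting.

```python
# Invented fares for one route; a real system would query years of history.
def advise(current_fare, historical_fares, buy_percentile=0.25):
    cheaper_share = sum(f <= current_fare for f in historical_fares) / len(historical_fares)
    return "buy now" if cheaper_share <= buy_percentile else "wait: likely to drop"

history = [320, 290, 410, 380, 300, 450, 360, 335]
print(advise(305, history))  # 'buy now' -- only 25% of past fares were cheaper
```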


While trying to predict airfare trends is nothing new, Hopper told TechCrunch that it can not only save users up to 40% on airfares but also find them the lowest possible price 95% of the time. Surry says that’s all due to Hopper’s algorithms and data models.

The Hopper app launched on iOS in January 2015 and on Android eight months later. The company also switched in September 2015 from directing customers to external travel agencies to taking bookings directly through the app for a small fee. The Hopper app has already been downloaded to more than 2 million phones worldwide.

Surry predicts that we’ll soon see sophisticated chatbots that can start with vague requests from customers like “I want to go somewhere warm in February for less than $500,” proceed to ask questions that help users narrow their options, and finally book a trip that meets all their desired parameters. Eventually, he says, these chatbots will be able to handle millions of interactions simultaneously, allowing a wide variety of companies to reassign human call center agents to the handling of high-value transactions and exceptions to the rules built into the digitized booking process.

Port of Hamburg Lets the Machines Untangle Complexity

In early 2015, AI experts told Wired magazine that at least another 10 years would pass before a computer could best the top human players at Go, an ancient game that’s exponentially harder than chess. Yet barely a year later, Wired also reported that machine learning techniques drove Google’s AlphaGo AI to win four games out of five against one of the world’s top Go players. This feat proves just how good algorithms have become at managing extremely complex situations with multiple interdependent choices, Saravana points out.

The Port of Hamburg, which has digitized traffic management for an estimated 40,000 trucks a day, is a good example. In the past, truck drivers had to show up at the port to check traffic and parking message boards. If they arrived before their ships docked, they had to drive around or park in the neighboring residential area, contributing to congestion and air pollution while they waited to load or unload. Today, the HPA’s smartPORT mobile app tracks individual trucks using telematics. It customizes the information that drivers receive based on location and optimizes truck routes and parking in real time so drivers can make more stops a day with less wasted time and fuel.

The platform that drives the smartPORT app also uses sensor data in other ways: it tracks wind speed and direction and transmits the data to ship pilots so they can navigate in and out of the port more safely. It monitors emissions and their impact on air quality in various locations in order to adjust operations in real time for better control over environmental impact. It automatically activates streetlights for vehicle and pedestrian traffic, then switches them off again to save energy when the road is empty. This ability to coordinate and optimize multiple business functions on the fly makes the Port of Hamburg a textbook example of a Live Business.

Digitization Is Not Bounded by Industry

Retail and B2B businesses of all types will inevitably join the Port of Hamburg in further digitizing processes, both in predictable ways and in ways we can only begin to imagine.

Customer service, for example, is likely to be in the vanguard. Automated systems already feed information about customers to online and phone-based service representatives in real time, generate cross-selling and upselling opportunities based on past transactions, and answer customers’ frequently asked questions. Saravana foresees these systems becoming even more sophisticated, powered by AI algorithms that are virtually indistinguishable from human customer service agents in their ability to handle complex live interactions in real time.

In manufacturing and IT, Sven Bauszus, global vice president and general manager for predictive analytics at SAP, forecasts that sensors and predictive analysis will further automate the process of scheduling and performing maintenance, such as monitoring equipment for signs of failure in real time, predicting when parts or entire machines will need replacement, and even ordering replacements preemptively. Similarly, combining AI, sensors, data mining, and other technologies will enable factories to optimize workforce assignments in real time based on past trends, current orders, and changing market conditions.

Public health will be able to go live with technology that spots outbreaks of infectious disease, determines where medical professionals and support personnel are needed most and how many to send, and helps ensure that they arrive quickly with the right medication and equipment to treat patients and eradicate the root cause. It will also make it easier to track communicable illnesses, find people who are symptomatic, and recommend approaches to controlling the spread of the illness, Bauszus says.

He also predicts that the insurance industry, which has already begun to digitize its claims-handling processes, will refine its ability to sort through more claims in less time with greater accuracy and higher customer satisfaction. Algorithms will be better and faster at flagging claims that have a high probability of being fraudulent and then pushing them to claims inspectors for investigation. Simultaneously, the same technology will be able to identify and resolve valid claims in real time, possibly even cutting a check or depositing money directly into the insured person’s bank account within minutes.
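A claims-triage rule of that kind might look like the following sketch (the thresholds and claim records are invented): claims the model scores as likely fraud go to inspectors, clearly valid ones are paid automatically, and the rest take the normal path.

```python
# Illustrative triage thresholds -- not any insurer's actual policy.
def triage(claim_id, fraud_probability, auto_pay_below=0.05, inspect_above=0.60):
    if fraud_probability >= inspect_above:
        return f"{claim_id}: route to claims inspector"
    if fraud_probability < auto_pay_below:
        return f"{claim_id}: approve and pay in real time"
    return f"{claim_id}: standard review queue"

for claim_id, p in [("C-101", 0.82), ("C-102", 0.01), ("C-103", 0.30)]:
    print(triage(claim_id, p))
```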

Financial services firms will be able to apply machine learning, data mining, and AI to accelerate the process of rating borrowers’ credit and detecting fraud. Instead of filling out a detailed application, consumers might be able to get on-the-spot approval for a credit card or loan after inputting only enough information to be identified. Similarly, banks will be able to alert customers to suspicious transactions by text message or phone call—not within a day or an hour, as is common now, but in a minute or less.

Pitfalls and Possibilities

As intelligent as business processes can be programmed to be, there will always be a point beyond which they have to be supervised. Indeed, Saravana forecasts increasing regulation around when business processes can and can’t be digitized. Especially in areas involving data security, physical security, and health and safety, it’s one thing to allow machines to parse data and arrive at decisions to drive a critical business process, but it’s another thing entirely to allow them to act on those decisions without human oversight.

Automated, impersonal decision making is fine for supply chain automation, demand forecasting, inventory management, and other processes that need faster-than-human response times. In human-facing interactions, though, Saravana insists that it’s still best to digitize the part of the process that generates decisions, but leave it to a human to finalize the decision and decide how to put it into action.

“Any time the interaction is machine-to-machine, you don’t need a human to slow the process down,” he says. “But when the interaction involves a person, it’s much more tricky, because people have preferences, tastes, the ability to try something different, the ability to get fatigued—people are only statistically predictable.”

For example, technology has made it entirely possible to build a corporate security system that can gather information from cameras, sensors, voice recognition technology, and other IP-enabled devices. The system can then feed that information in a steady stream to an algorithm designed to identify potentially suspicious activity and act in real time to prevent or stop it while alerting the authorities. But what happens when an executive stays in the office unusually late to work on a presentation and the security system misidentifies her as an unauthorized intruder? What if the algorithm decides to lock the emergency exits, shut down the executive’s network access, or disable her with a Taser instead of simply sending an alert to the head of security asking what to do while waiting for the police to come?

The Risk Is Doing Nothing

The greater, if less dramatic, risk associated with digitizing business processes is simply failing to pursue it. It’s true that taking advantage of new digital technologies can be costly in the short term. There’s no question that companies have to invest in hardware, software, and qualified staff in order to prepare enormous data volumes for storage and analysis. They also have to implement new data sources such as sensors or Internet-connected devices, develop data models, and create and test algorithms to drive business processes that are currently analog. But as with any new technology, Saravana advises, it’s better to start small with a key use case, rack up a quick win with high ROI, and expand gradually than to drag your heels out of a failure to grasp the long-term potential.

The economy is digitizing rapidly, but not evenly. According to the McKinsey Global Institute’s December 2015 Digital America report, “The race to keep up with technology and put it to the most effective business use is producing digital ‘haves’ and ‘have-mores’—and the large, persistent gap between them is becoming a decisive factor in competition across the economy.” Companies that want to be among the have-mores need to commit to Live Business today. Failing to explore it now will put them on the wrong side of the gap and, in the long run, rack up a high price tag in unrealized efficiencies and missed opportunities.



About Erik Marcade

Erik Marcade is vice president of Advanced Analytics Products at SAP.

Digital Transformation Needs More Than Technology (Part 2)

Andreas Hauser

In my last blog, I explained why design, design thinking, and experience matter in digital transformation, which goes beyond pure technology and business skills. The credo is to engage with your customers, and most importantly, with the users right from the beginning, in an iterative, user-centric design process.

Digital transformation is more than a one-time project; it is a journey. Ultimately, enterprises want to prepare their organization for a sustainable design-led digital transformation.

But how to achieve this? That is the focus of today’s blog.

Foster a design-led innovation culture

Changing a company culture is not easy. If you don’t want your company to end up like Nokia or BlackBerry, you had better start experimenting with new practices sooner rather than later. More and more companies are training their people on design thinking and want to establish a culture of design-led innovation.

My formula for innovation culture is people + process + place.

You need the right skills in your organization (people): business, technical, and design skills to better understand the needs of customers and users. Business and technical skills are typically not the problem. But how many people with a design background do you have in your organization? This is why more and more companies are starting to hire designers and to train their employees on design thinking.

It is not just about having people with the right skills. You also need to change the way (process) you engage with customers and users. To be successful, you need to combine design thinking with agile methodologies. The process is pretty simple: get people with the right skills working together as one team, and iterate from beginning to end with the customers and users. It sounds simple, but it is sometimes difficult to execute in large global organizations.

Let me tell you a story.

We trained about 300 people (business + IT) at one of our customers in design thinking and helped establish five design thinking coaches in their organization. The most interesting outcome: after the exercise, seven out of 10 IT projects were initiated by the business. The customer told us that this was the first time the business proactively wanted to work with IT. This is a great start toward repairing a relationship that has grown distant in many organizations in recent years.

In our experience, the place where people work together also has a huge impact on creativity. Therefore, at SAP we have established “customer-facing” co-innovation spaces – called SAP AppHaus – where customers and SAP collaborate and co-innovate as partners. Establishing creative spaces within your organization gives the cultural change a face. Skills and new ways of working are often not very visible; the space is physical, and people see and feel that your organization is changing. Check out the virtual walk-through of our AppHaus in Heidelberg.

In my last blog, I discussed our co-innovation journey with Mercedes-AMG, which paved Mercedes-AMG’s way to a sustainable digital transformation. Based on my experience, you first need to show with a lighthouse project that this methodology creates business value for the company. You can then build on this success and start the journey to establish a sustainable design-led culture in your organization.

Be prepared for a long journey. It takes time to change the way organizations work. There has never been a more exciting time for designers, because the industry is starting to see the huge value they can bring to organizations and to a successful digital transformation.

Have a look at a more detailed presentation and a video recording about the concept that “digital transformation is more than technology.” Learn more about customer success stories on the UX Design Services web page.

Stay tuned for more articles as part of this blog series, in which you can explore further perspectives on digital transformation and its various aspects, learn about organizational readiness for design thinking, and assess how ready your organization is to embark on this journey.

Read Part 1 of this discussion: Digital Transformation Needs More Than Technology.

This article originally appeared on SAP Business Trends.



About Andreas Hauser

Andreas is global head of the Design and Co-Innovation Center at SAP. His team drives customer and strategic design projects through co-innovation and design thinking. Before that, he was vice president of user experience for OnDemand solutions at SAP SE.