
Setting The Record Straight on SAP HANA

Ken Tsai

On Feb 22, 2012, I read a blog from Oracle that created a lot of FUD around SAP HANA (HANA). It looks like we have touched a nerve there, because the competitive response we are seeing amounts to a whole lot of trash talk. It also means we are doing something right…

Should we even dignify this with a response? The tweets we saw about the blog all made a similar point:


@cochesdiez: #Oracle not telling the truth on #SAP #HANA #Inmemory See my tweets in responding to some simply untrue facts

@jdh2n: @oracleEPMBI @manangoel Might want to check your facts on some of the points. Sounds like marketing flare not technical facts. #sap #HANA


SAP has traditionally taken the high road in these matters, and in keeping with that spirit I’ve decided to join the community rather than let this disingenuousness pass. I will address the FUD and misinformation that were rife in that blog and set the record straight with facts instead of speculation. Our customers will do the same for us, and in the final analysis that’s what matters.

So here goes…


#1 HANA is first to the market with next-generation innovation in databases

SAP has ample experience with first-generation in-memory technologies such as SAP APO, TREX, and P*TIME. And yes, these technologies and others, including TimesTen, BerkeleyDB, and MySQL, have been around for years. SAP HANA is the next generation of in-memory technology, significantly more advanced than its predecessors, with many innovations and firsts. Some highlights:

  1. In-memory column and row-store
  2. No disk for operation
  3. Dynamic parallelism
  4. OLTP and OLAP together
  5. Standard SQL on structured and unstructured data
  6. Insert only
  7. Multi-node
  8. Lightning-fast bulk load (customers love this and say that this capability alone makes HANA worth it)
  9. Application server AND database server 

One can argue that the iPhone was not new either, but it was the next generation of “smart phone,” and it did redefine the category by putting everything together – your entire music library, flash storage instead of disk, new apps.

#2 HANA is growing rapidly in adoption and revenue

In six short months we have more than 200 customers and $200 million in software license revenue – a massive uptake of the product no matter how you slice it. Our competition is comparing it to a mature cash-cow product that has led the market for decades, and they fold their hardware, software, and maintenance revenues into the comparison. Not bad for a new, innovative product. If HANA weren’t threatening their hegemony in the space, you would not be reading this blog response. HANA is the innovator, not the incumbent. We are focused on software innovation that simplifies our customers’ landscapes, not on selling more expensive hardware, repackaging existing products into pricier appliances, and increasing our customers’ TCO.

#3 HANA is enterprise ready with manageability and reliability. 

HANA has been available for scale-out since Nov 7. High availability is standard, it is fully ACID compliant, it can fail over to a standby in-memory system, and it persists to disk so that, in case of complete failure, it can replay logs to recover rapidly. HANA schemas can be extended on the fly with no re-indexing needed; because there are no physical layers, structural changes are easier to implement. And because this is a modern, insert-only design with no in-place updates, antiquated locking mechanisms are not needed and concurrency can be much higher than in a traditional DBMS such as Oracle. HANA has been deployed for mission-critical applications such as smart-meter analytics (Centrica), customer segmentation for marketing offers (T-Mobile), and cost-based profitability analysis (Colgate, Ferrero, Honeywell). Our customers have stood up time and again to highlight how HANA has helped them deliver new business value while maintaining the enterprise manageability and reliability SAP is known for. A minimal sketch of the insert-only idea follows.
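To make the insert-only point concrete, here is a toy sketch of the pattern in Python. This is an illustration of the general technique, not HANA’s internal implementation; the table layout and function names are invented.

```python
# Toy illustration of the insert-only pattern: "updates" append new
# versions instead of modifying rows in place, so readers never need
# locks against writers. Not HANA's internals -- just the concept.
import time

table = []  # each row: (key, value, version_timestamp)

def upsert(key, value):
    # Append a new version; existing rows are immutable.
    table.append((key, value, time.time()))

def read_latest(key):
    # Readers pick the newest version of a key without blocking writers.
    rows = [r for r in table if r[0] == key]
    return max(rows, key=lambda r: r[2])[1] if rows else None

upsert("customer-42", {"segment": "A"})
upsert("customer-42", {"segment": "B"})  # a new version, not an in-place update
print(read_latest("customer-42"))        # -> {'segment': 'B'}
```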

#4 HANA is non-disruptive, with plug and play on the layers above and below

HANA is open and offers customers real choice. HANA is ANSI SQL compliant; that’s why companies like Medtronic could take ERDs built in Sybase PowerDesigner and deploy them on HANA in no time. HANA supports standard MDX interfaces. In addition to SAP products (SAP BusinessObjects, SuccessFactors), BI products such as Tableau, data visualization software like TIBCO Spotfire, Salesforce analytics, Microsoft clients like Excel, and collaborative platforms like Jive have all been tested to work with HANA. Even competitive applications such as UFIDA are looking at HANA. Oracle Exalytics, in comparison, is closed and packages only Oracle BI products. It also runs only on Sun hardware, while HANA is plug and play with 8 hardware vendors. SAP’s open ecosystem of hardware partners and partners building applications powered by HANA (coming soon) further confirms these facts. One more fun fact: HANA runs apps written for Oracle DB without any changes. Given that Oracle says Exadata runs 10x faster than its traditional RDBMS, and that HANA runs 1000x faster than the Oracle RDBMS in customer situations on average, one can approximate that apps on HANA run 1000 ÷ 10 = 100x faster than on Exadata.
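To illustrate what ANSI SQL compliance buys in practice, here is a minimal sketch assuming SAP’s hdbcli Python client (a DB-API driver); the host, credentials, and sales table are hypothetical. The point is that only the connection is database-specific – the SQL itself would run unchanged elsewhere.

```python
# Hedged sketch of SQL portability, assuming the hdbcli DB-API client;
# connection details and the "sales" table are hypothetical.
from hdbcli import dbapi

# Only this connection call is HANA-specific...
conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO", password="***")

# ...the query is plain ANSI SQL and needs no changes to move between
# compliant databases.
cur = conn.cursor()
cur.execute("""
    SELECT region, SUM(revenue) AS total
    FROM sales
    GROUP BY region
    ORDER BY total DESC
""")
for region, total in cur.fetchall():
    print(region, total)
cur.close()
conn.close()
```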

#5 HANA is available as an appliance with proven ease of deployment

SAP provides level 1 support for SAP HANA irrespective of whether the issue is hardware or software. SAP HANA appliance configurations are pre-certified, giving customers complete assurance on the software + hardware design while bringing open innovation, choice, and cost competition from multiple hardware providers. SAP goes further by providing rapid-deployment solutions (RDS) on top of the SAP HANA appliance, such as the SAP CO-PA Accelerator – using this RDS, customers like Provimi have gone live in less than 3 weeks. That’s what we call ease of use and deployment. Keep in mind that HANA runs on Intel architecture provided turnkey by 8 vendors on one operating system, and that it runs both OLTP and OLAP on this single architecture, greatly reducing TCO – and it installs and runs in record time, all further underscoring ease of use and deployment.

I do believe this is an inflection point, and companies need to rethink their data management strategy. Should they stay with an aging, mature DBMS, or bring in new innovation to handle the modern challenge of real-time analysis on extremely large data sets – without having to anticipate the questions or queries in advance, and without the prefabrication and tuning that old systems like the Oracle DBMS require?

Come to an SAP Forum in or near your city and you can see, touch, and feel HANA live in action. Better yet, if you are looking for more info – including test-driving HANA yourself – go to Experience SAP HANA for the real deal: everything from basic information to deep technical content on SAP HANA from practitioners, product managers, partners, and customers. Check it out for yourself!


About Ken Tsai

Ken Tsai is the VP and head of product marketing for Cloud Platform & Data Management at SAP. He leads global product marketing for SAP's platform-as-a-service and end-to-end data management solutions, driving new market opportunity discovery, pipeline acceleration, and external thought leadership for SAP's platform solution group (SAP HANA Cloud Platform, SAP HANA platform, SAP HANA Vora, SAP ASE, SAP IQ, SAP SQL Anywhere, SAP ESP).


Smart Environmental Monitoring: Keeping An Eye On Air Quality With More Data

Frank Wittmann

The Port of Hamburg records over 1.5 million measurement readings a day. Computer scientists from Potsdam’s Hasso Plattner Institute (HPI) are using in-memory technology to show how these vast volumes of data can help reduce emissions. The software solution will be presented for the first time at SAPPHIRE NOW.

Ports continue to be one of the most important transport hubs for international trade. Around 10,000 maritime vessels a year pass through Hamburg’s port alone, constituting Europe’s third-highest container handling volume. In doing so, they not only leave behind goods, but increasingly also data. Over 1.5 million environmental, position and traffic measurement readings are recorded by the Hamburg Port Authority (HPA) every day. Computer scientists from the Hasso Plattner Institute are working on using these huge quantities of data to create models for air quality in the port area.

The Potsdam researchers have developed software designed to enable the port of Hamburg to track air quality data and trends at any time.

“Our software application enables these different data sources to be linked together. Furthermore, the vast amounts of data can be visualized in order to give a comprehensive picture of the port facility,” explains Dr Matthias Uflacker, deputy chair of the HPI faculty for Enterprise Platforms and Integration Concepts. He adds that air quality analyses nowadays have a long-term focus, and are very static, whereas the HPI software seeks to enable dynamic analyses in real time.

The software can be used to calculate and track sulphur dioxide, nitrogen dioxide, and particulate matter readings, the positions of various ship types, and traffic flows on vehicle routes, all on an interactive map of the port area.

“For the first time ever, emissions can thus be approximately calculated and displayed for specific time frames and regions in a matter of seconds,” says Uflacker. These analyses could ultimately help experts implement emission-reducing measures.

There are a wide range of possible applications – for example, in the usage and development of environmentally friendly external power infrastructure. When ships dock in a port, they have to keep their auxiliary machinery running in order to supply power to the onboard electronics systems. Exhaust gases are therefore produced even during downtimes – something which can be heavily reduced using land-based power systems.

“The software will be able, for instance, to help determine the optimum location for the environmentally friendly power sources,” explains Ulrich Baldauf, head of IT strategy at the HPA.

The application created by Uflacker’s Potsdam team is based on in-memory database technology co-developed at the HPI, and is a focus area of current research. Anyone wanting to try this Potsdam-produced software for themselves can do so in Orlando at SAPPHIRE NOW from May 17-19, where the emission-analysis software will be launched. The HPI will also be presenting selected projects on in-memory database technology and innovative online learning formats for universities and companies at stand 100.

Frank Wittmann is online marketing manager for Hasso Plattner Institute.

SAPPHIRE NOW + ASUG ANNUAL CONFERENCE, May 17-19, 2016: Live from the event, watch keynotes, strategic sessions, press conferences, and more. Online and on-demand. Learn more here.

 


About Frank Wittmann

Frank Wittmann is the online marketing manager for the Hasso Plattner Institute. His specialties include online marketing, social media marketing, web content management, online publishing and online communications.

How Big Data Drives HR In 2016

Meghan M. Biro

Big Data has helped earn human resources a seat at the table (or so we hope, as we move beyond buzzword phrases)—an active part of the business planning process, supported by the deep insights and predictive analytics that this gold mine of information provides.

But a lot of HR departments are still trying to figure out what to do with all that data. A report from Oxford Economics and SAP found that few HR departments are without business analytics altogether, but many still struggle to use what they have in a way that matters.

There’s good news for the coming year. As our ability to analyze and interpret Big Data matures, new tools are hitting the marketplace while existing ones are getting smarter. Here’s a look at how Big Data will drive HR this year, and the biggest trends you need to know about.

Watch for these Big Data trends

Technology will never take the place of a highly skilled HR professional, but it can validate decisions and streamline operations—in real time. Companies that take an interest in these trends early may be able to leverage them in the marketplace.

  1. Vanity metrics—stats that look good but offer little meaningful insight—are fading away. Quality trumps quantity when it comes to data sets, and the application of metrics matters far more than in the past. As companies attract more data analysts and train employees to use analytics programs, teams are focusing more on the strategy behind how and what they collect.
  2. Predictive analytics are getting smarter. Predictive analytics can be a powerful tool for business as a whole, and the programs available are finally stepping up their game. While they can provide insights into employee benefits, promotions, and talent management, predictive analytics are starting to be used for deeper forecasting. For example, they can help measure the efficacy of training courses or identify which employees are more likely to reach their targets and why.
  3. Analytics tools are getting simpler—and more affordable. One thing that has held analytics back is that some companies can’t afford a full suite of tools, while others find the applications they have don’t always uncover the information they want. But new options are on the horizon. Companies such as Dell and Oracle have embraced an open source approach to HR and recruiting. More options will fuel the use of analytics across organizations of all sizes.
  4. You can put a value on human capital. Organizations often claim that human capital is one of the most important business assets companies have, but they have a difficult time backing up that statement with data. With analytics, companies can assign financial values to individual tasks and better understand the financial impact of every person in their organization, which has potential implications for recruitment, benefits, and talent retention.
  5. Sensors offer a whole new perspective. There are new ways to collect data—from internal monitoring systems, online listening platforms, or even the growing Internet of Things (IoT)—and use it for on-the-floor insights. In industries such as manufacturing and farming, sensor-driven data can provide information on machine or crop performance. But it can also impact HR responsibilities: For example, Honeywell and Intel recently introduced a prototype for sensors that monitor worker safety. If HR departments can identify warning signs or other real-time data signals, they can find new ways to improve regulation compliance and worker safety.
  6. Data analysts are in high demand. CNBC called it “the sexiest job of the 21st century,” and it’s definitely one of the hottest jobs out there. It takes a skilled data analyst to understand how to massage and extract data and produce actionable reports. Not surprisingly, they’re in relatively short supply. Organizations will need to get creative to find the talent they need to meet their analytics needs.

Big data has the potential to improve every aspect of business—if companies are willing to take the time and effort to figure out how. The right data-focused talent and tools can transform an organization. The opportunity is there for big data to drive HR; you just need to take advantage of it.

For more insight on how advanced tech is transforming HR, see The Internet of Things Will Fundamentally Change HR.



Unlock Your Digital Super Powers: How Digitization Helps Companies Be Live Businesses

Erik Marcade and Fawn Fitter

The Port of Hamburg handles 9 million cargo containers a year, making it one of the world’s busiest container ports. According to the Hamburg Port Authority (HPA), that volume doubled in the last decade, and it’s expected to at least double again in the next decade—but there’s no room to build new roads in the center of Hamburg, one of Germany’s historic cities. The port needed a way to move more freight more efficiently with the physical infrastructure it already has.

The answer, according to an article on ZDNet, was to digitize the processes of managing traffic into, within, and back out of the port. By deploying a combination of sensors, telematics systems, smart algorithms, and cloud data processing, the Port of Hamburg now collects and analyzes a vast amount of data about ship arrivals and delays, parking availability, ground traffic, active roadwork, and more. It generates a continuously updated model of current port conditions, then pushes the results through mobile apps to truck drivers, letting them know exactly when ships are ready to drop off or receive containers and optimizing their routes. According to the HPA, the port is now on track to handle 25 million cargo containers a year by 2025 without further congestion or construction, helping shipping companies bring more goods and raw materials in less time to businesses and consumers all across Europe.

In the past, the port could only have solved its problem with backhoes and building permits—which, given the physical constraints, means the problem would have been unsolvable. Today, though, software and sensors are allowing it to improve processes and operations to a previously impossible extent. Big Data analysis, data mining, machine learning, artificial intelligence (AI), and other technologies have finally become sophisticated enough to identify patterns not just in terabytes but in petabytes of data, make decisions accordingly, and learn from the results, all in seconds. These technologies make it possible to digitize all kinds of business processes, helping organizations become more responsive to changing market conditions and more able to customize interactions to individual customer needs. Digitization also streamlines and automates these processes, freeing employees to focus on tasks that require a human touch, like developing innovative strategies or navigating office politics.

In short, digitizing business processes is key to ensuring that the business can deliver relevant, personalized responses to the market in real time. And that, in turn, is the foundation of the Live Business—a business able to coordinate multiple functions in order to respond to and even anticipate customer demand at any moment.

Some industries and organizations are on the verge of discovering how business process digitization can help them go live. Others have already started putting it into action: fine-tuning operations to an unprecedented level across departments and at every point in the supply chain, cutting costs while turbocharging productivity, and spotting trends and making decisions at speeds that can only be called superhuman.

Balancing Insight and Action

Two kinds of algorithms drive process digitization, says Chandran Saravana, senior director of advanced analytics at SAP. Edge algorithms operate at the point where customers or other end users interact directly with a sensor, application, or Internet-enabled device. These algorithms, such as speech or image recognition, focus on simplicity and accuracy. They make decisions based primarily on their ability to interpret input with precision and then deliver a result in real time.

Edge algorithms work in tandem with, and sometimes mature into, server-level algorithms, which report on both the results of data analysis and the analytical process itself. For example, the complex systems that generate credit scores assess how creditworthy an individual is, but they also explain to both the lender and the credit applicant why a score is low or high, what factors went into calculating it, and what an applicant can do to raise the score in the future. These server-based algorithms gather data from edge algorithms, learn from their own results, and become more accurate through continuous feedback. The business can then track the results over time to understand how well the digitized process is performing and how to improve it.
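A minimal sketch of this edge/server division of labor, with invented names, thresholds, and feedback logic – an illustration of the pattern described above, not SAP’s implementation:

```python
# Toy edge/server split: the edge decides instantly; the server
# aggregates outcomes and tunes the edge via feedback. All names
# and thresholds are hypothetical.
from collections import Counter

def edge_classify(reading: float, threshold: float = 0.8) -> str:
    # Edge algorithm: simple, fast, decides locally in real time.
    return "alert" if reading > threshold else "ok"

class ServerModel:
    # Server-level algorithm: records results of edge decisions and
    # adjusts the edge threshold through continuous feedback.
    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.outcomes = Counter()

    def feedback(self, label: str, was_correct: bool):
        self.outcomes[(label, was_correct)] += 1
        # Crude continuous learning: raise the threshold when false
        # alarms accumulate, making the edge less trigger-happy.
        if self.outcomes[("alert", False)] > 10:
            self.threshold = min(0.95, self.threshold + 0.01)

server = ServerModel()
label = edge_classify(0.91, server.threshold)  # instant edge decision
server.feedback(label, was_correct=True)       # slower server learning loop
print(label, server.threshold)
```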

From Data Scarcity to a Glut

To operate in real time, businesses need an accurate data model that compares what’s already known about a situation to what’s happened in similar situations in the past to reach a lightning-fast conclusion about what’s most likely to happen next. The greatest barrier to this level of responsiveness used to be a lack of data, but the exponential growth of data volumes in the last decade has flipped this problem on its head. Today, the big challenge for companies is having too much data and not enough time or power to process it, says Saravana.

Even the smartest human is incapable of gathering all the data about a given situation, never mind considering all the possible outcomes. Nor can a human mind reach conclusions at the speed necessary to drive Live Business. On the other hand, carefully crafted algorithms can process terabytes or even petabytes of data, analyze patterns and detect outliers, arrive at a decision in seconds or less—and even learn from their mistakes (see How to Train Your Algorithm).

How to Train Your Algorithm 

The data that feeds process digitization can’t just simmer. It needs constant stirring.

Successfully digitizing a business process requires you to build a model of the business process based on existing data. For example, a bank creates a customer record that includes not just the customer’s name, address, and date of birth but also the amount and date of the first deposit, the type of account, and so forth. Over time, as the customer develops a history with the bank and the bank introduces new products and services, customer records expand to include more data. Predictive analytics can then extrapolate from these records to reach conclusions about new customers, such as calculating the likelihood that someone who just opened a money market account with a large balance will apply for a mortgage in the next year.
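As a minimal sketch of that kind of extrapolation – hypothetical fields, synthetic data, and scikit-learn standing in for whatever engine a bank would actually use:

```python
# Hedged sketch of predictive scoring on bank customer records; the
# fields, data, and model are illustrative, not any bank's real system.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: first-deposit amount ($), months as customer, has money market (0/1)
X = np.array([[500, 3, 0], [25000, 2, 1], [1200, 48, 0],
              [40000, 1, 1], [800, 24, 0], [30000, 6, 1]])
y = np.array([0, 1, 0, 1, 0, 1])  # applied for a mortgage within a year

model = LogisticRegression(max_iter=1000).fit(X, y)

# New customer: large opening balance, brand new, money market account.
p = model.predict_proba([[35000, 1, 1]])[0, 1]
print(f"Estimated mortgage-application likelihood: {p:.0%}")
```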

To keep data models accurate, you have to have enough data to ensure that your models are complete—that is, that they account for every possible predictable outcome. The model also has to push outlying data and exceptions, which create unpredictable outcomes, to human beings who can address their special circumstances. For example, an algorithm may be able to determine that a delivery will fail to show up as scheduled and can point to the most likely reasons why, but it can only do that based on the data it can access. It may take a human to start the process of locating the misdirected shipment, expediting a replacement, and establishing what went wrong by using business knowledge not yet included in the data model.

Indeed, data models need to be monitored for relevance. Whenever the results of a predictive model start to drift significantly from expectations, it’s time to examine the model to determine whether you need to dump old data that no longer reflects your customer base, add a new product or subtract a defunct one, or include a new variable, such as marital status or length of customer relationship, that further refines your results. A toy version of such a drift check appears below.
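A minimal sketch of a drift check in this spirit; the numbers and the 10% threshold are invented for illustration:

```python
# Toy drift monitor: compare recent model accuracy to the accuracy
# measured at deployment; data and threshold are illustrative.
def accuracy(predictions, actuals):
    hits = sum(p == a for p, a in zip(predictions, actuals))
    return hits / len(actuals)

baseline = 0.92          # accuracy when the model was validated
recent_preds  = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
recent_actual = [1, 0, 0, 1, 1, 1, 0, 1, 1, 0]

drift = baseline - accuracy(recent_preds, recent_actual)
if drift > 0.10:  # results drifting significantly from expectations
    print(f"Drift of {drift:.0%}: re-examine the model -- stale data, "
          "missing products, or a new variable may be needed.")
```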

It’s also important to remember that data doesn’t need to be perfect—and, in fact, probably shouldn’t be, no matter what you might have heard about the difficulty of starting predictive analytics with lower-quality data. To train an optical character recognition system to recognize and read handwriting in real time, for example, your training samples of block printing and cursive writing also have to include a few sloppy scrawls so the system can learn to decode them.

On the other hand, in a fast-changing marketplace, all the products and services in your database need consistent and unchanging references, even though outside the database, names, SKUs, and other identifiers for a single item may vary from one month or one order to the next. Without consistency, your business process model won’t be accurate, nor will the results.

Finally, when you’re using algorithms to generate recommendations to drive your business process, the process needs to include opportunities to test new messages and products against existing successful ones as well as against random offerings, Saravana says. Otherwise, instead of responding to your customers’ needs, your automated system will actually control their choices by presenting them with only a limited group of options drawn from those that have already received the most positive results.

Any process is only as good as it’s been designed to be. Digitizing business processes doesn’t eliminate the possibility of mistakes and problems, but it does ensure that the mistakes and problems that arise are easy to spot and fix.

From Waste to Gold

Organizations moving to digitize and streamline core processes are even discovering new business opportunities and building new digitized models around them. That’s what happened at Hopper, an airfare prediction app firm in Cambridge, Massachusetts, which discovered in 2013 that it could mine its archives of billions of itineraries to spot historical trends in airfare pricing—data that was previously considered “waste product,” according to Hopper’s chief data scientist, Patrick Surry.

Hopper developed AI algorithms to correlate those past trends with current fares and to predict whether and when the price of any given flight was likely to rise or fall. The results were so accurate that Hopper jettisoned its previous business model. “We check up to 3 billion itineraries live, in real time, each day, then compare them to the last three to four years of historical airfare data,” Surry says. “When consumers ask our smartphone app whether they should buy now or wait, we can tell them, ‘yes, that’s a good deal, buy it now,’ or ‘no, we think that fare is too expensive, we predict it will drop, and we’ll alert you when it does.’ And we can give them that answer in less than one second.”
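A minimal sketch of a buy-or-wait rule in this spirit, comparing a live fare against a historical distribution; the data and thresholds are invented, and this is emphatically not Hopper’s production model:

```python
# Toy buy-or-wait rule: compare today's fare to historical fares for
# the same route; all numbers are invented for illustration.
import statistics

historical_fares = [420, 455, 390, 510, 480, 430, 405, 465]  # USD
live_fare = 399

median = statistics.median(historical_fares)
low = min(historical_fares)

if live_fare <= low * 1.05:
    print(f"${live_fare}: good deal, buy it now.")
elif live_fare > median:
    print(f"${live_fare}: looks expensive; we predict it will drop -- wait.")
else:
    print(f"${live_fare}: near typical; set an alert.")
```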


While trying to predict airfare trends is nothing new, Hopper has told TechCrunch that it can not only save users up to 40% on airfares but it can also find them the lowest possible price 95% of the time. Surry says that’s all due to Hopper’s algorithms and data models.

The Hopper app launched on iOS in January 2015 and on Android eight months later. The company also switched in September 2015 from directing customers to external travel agencies to taking bookings directly through the app for a small fee. The Hopper app has already been downloaded to more than 2 million phones worldwide.

Surry predicts that we’ll soon see sophisticated chatbots that can start with vague requests from customers like “I want to go somewhere warm in February for less than $500,” proceed to ask questions that help users narrow their options, and finally book a trip that meets all their desired parameters. Eventually, he says, these chatbots will be able to handle millions of interactions simultaneously, allowing a wide variety of companies to reassign human call center agents to the handling of high-value transactions and exceptions to the rules built into the digitized booking process.

Port of Hamburg Lets the Machines Untangle Complexity

In early 2015, AI experts told Wired magazine that at least another 10 years would pass before a computer could best the top human players at Go, an ancient game that’s exponentially harder than chess. Yet before the end of that same year, Wired also reported that machine learning techniques drove Google’s AlphaGo AI to win four games out of five against one of the world’s top Go players. This feat proves just how good algorithms have become at managing extremely complex situations with multiple interdependent choices, Saravana points out.

The Port of Hamburg, which has digitized traffic management for an estimated 40,000 trucks a day, is a good example. In the past, truck drivers had to show up at the port to check traffic and parking message boards. If they arrived before their ships docked, they had to drive around or park in the neighboring residential area, contributing to congestion and air pollution while they waited to load or unload. Today, the HPA’s smartPORT mobile app tracks individual trucks using telematics. It customizes the information that drivers receive based on location and optimizes truck routes and parking in real time so drivers can make more stops a day with less wasted time and fuel.

The platform that drives the smartPORT app also uses sensor data in other ways: it tracks wind speed and direction and transmits the data to ship pilots so they can navigate in and out of the port more safely. It monitors emissions and their impact on air quality in various locations in order to adjust operations in real time for better control over environmental impact. It automatically activates streetlights for vehicle and pedestrian traffic, then switches them off again to save energy when the road is empty. This ability to coordinate and optimize multiple business functions on the fly makes the Port of Hamburg a textbook example of a Live Business.

Digitization Is Not Bounded by Industry

Other retail and B2B businesses of all types will inevitably join the Port of Hamburg in further digitizing processes, both in predictable ways and in those we can only begin to imagine.

Customer service, for example, is likely to be in the vanguard. Automated systems already feed information about customers to online and phone-based service representatives in real time, generate cross-selling and upselling opportunities based on past transactions, and answer customers’ frequently asked questions. Saravana foresees these systems becoming even more sophisticated, powered by AI algorithms that are virtually indistinguishable from human customer service agents in their ability to handle complex live interactions in real time.

In manufacturing and IT, Sven Bauszus, global vice president and general manager for predictive analytics at SAP, forecasts that sensors and predictive analysis will further automate the process of scheduling and performing maintenance, such as monitoring equipment for signs of failure in real time, predicting when parts or entire machines will need replacement, and even ordering replacements preemptively. Similarly, combining AI, sensors, data mining, and other technologies will enable factories to optimize workforce assignments in real time based on past trends, current orders, and changing market conditions.

Public health will be able to go live with technology that spots outbreaks of infectious disease, determines where medical professionals and support personnel are needed most and how many to send, and helps ensure that they arrive quickly with the right medication and equipment to treat patients and eradicate the root cause. It will also make it easier to track communicable illnesses, find people who are symptomatic, and recommend approaches to controlling the spread of the illness, Bauszus says.

He also predicts that the insurance industry, which has already begun to digitize its claims-handling processes, will refine its ability to sort through more claims in less time with greater accuracy and higher customer satisfaction. Algorithms will be better and faster at flagging claims that have a high probability of being fraudulent and then pushing them to claims inspectors for investigation. Simultaneously, the same technology will be able to identify and resolve valid claims in real time, possibly even cutting a check or depositing money directly into the insured person’s bank account within minutes.
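As a minimal sketch of that triage flow – the scoring rule and thresholds are invented here; a real insurer would use trained models:

```python
# Toy claims triage: score each claim for fraud risk, route high-risk
# claims to human inspectors, settle the rest automatically.
def fraud_score(claim: dict) -> float:
    score = 0.0
    if claim["amount"] > 20000:         score += 0.4
    if claim["days_since_policy"] < 30: score += 0.4
    if claim["prior_claims"] > 2:       score += 0.2
    return score

claims = [
    {"id": 1, "amount": 1800,  "days_since_policy": 400, "prior_claims": 0},
    {"id": 2, "amount": 45000, "days_since_policy": 12,  "prior_claims": 3},
]

for c in claims:
    if fraud_score(c) >= 0.5:
        print(f"claim {c['id']}: flagged for inspector review")
    else:
        print(f"claim {c['id']}: valid -- pay out in real time")
```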

Financial services firms will be able to apply machine learning, data mining, and AI to accelerate the process of rating borrowers’ credit and detecting fraud. Instead of filling out a detailed application, consumers might be able to get on-the-spot approval for a credit card or loan after inputting only enough information to be identified. Similarly, banks will be able to alert customers to suspicious transactions by text message or phone call—not within a day or an hour, as is common now, but in a minute or less.

Pitfalls and Possibilities

As intelligent as business processes can be programmed to be, there will always be a point beyond which they have to be supervised. Indeed, Saravana forecasts increasing regulation around when business processes can and can’t be digitized. Especially in areas involving data security, physical security, and health and safety, it’s one thing to allow machines to parse data and arrive at decisions to drive a critical business process, but it’s another thing entirely to allow them to act on those decisions without human oversight.

Automated, impersonal decision making is fine for supply chain automation, demand forecasting, inventory management, and other processes that need faster-than-human response times. In human-facing interactions, though, Saravana insists that it’s still best to digitize the part of the process that generates decisions, but leave it to a human to finalize the decision and decide how to put it into action.

“Any time the interaction is machine-to-machine, you don’t need a human to slow the process down,” he says. “But when the interaction involves a person, it’s much more tricky, because people have preferences, tastes, the ability to try something different, the ability to get fatigued—people are only statistically predictable.”

For example, technology has made it entirely possible to build a corporate security system that can gather information from cameras, sensors, voice recognition technology, and other IP-enabled devices. The system can then feed that information in a steady stream to an algorithm designed to identify potentially suspicious activity and act in real time to prevent or stop it while alerting the authorities. But what happens when an executive stays in the office unusually late to work on a presentation and the security system misidentifies her as an unauthorized intruder? What if the algorithm decides to lock the emergency exits, shut down the executive’s network access, or disable her with a Taser instead of simply sending an alert to the head of security asking what to do while waiting for the police to come?

The Risk Is Doing Nothing

The greater, if less dramatic, risk associated with digitizing business processes is simply failing to pursue it. It’s true that taking advantage of new digital technologies can be costly in the short term. There’s no question that companies have to invest in hardware, software, and qualified staff in order to prepare enormous data volumes for storage and analysis. They also have to implement new data sources such as sensors or Internet-connected devices, develop data models, and create and test algorithms to drive business processes that are currently analog. But as with any new technology, Saravana advises, it’s better to start small with a key use case, rack up a quick win with high ROI, and expand gradually than to drag your heels out of a failure to grasp the long-term potential.

The economy is digitizing rapidly, but not evenly. According to the McKinsey Global Institute’s December 2015 Digital America report, “The race to keep up with technology and put it to the most effective business use is producing digital ‘haves’ and ‘have-mores’—and the large, persistent gap between them is becoming a decisive factor in competition across the economy.” Companies that want to be among the have-mores need to commit to Live Business today. Failing to explore it now will put them on the wrong side of the gap and, in the long run, rack up a high price tag in unrealized efficiencies and missed opportunities.


About Erik Marcade

Erik Marcade is vice president of Advanced Analytics Products at SAP.

The New Digital Healthcare Patient Experience

Martin Kopp

Digitized healthcare has arrived. And it is only going to get better. Since the 1950s, information technology has had a growing influence on the healthcare industry. And today, more than three-quarters of all patients expect to use digital services in the future. That is, if they are not using them already. Healthcare consumers have become more informed and proactive.

Today, a pregnant woman can schedule a gynecology appointment electronically. Her insurance company probably offers a smartphone app to monitor her health. She can download the app and self-register. The app documents her ongoing health as she updates the profile data. And because her data is stored in the cloud, her gynecologist has immediate access to it.

These are a few examples of the important trends shaping the patient experience with digital innovation. The latest digital solutions are bringing the patient and the healthcare industry closer together. And this digital connectivity means more personalized patient care.

Digital technology is changing the role of the patient. Patients are better informed and more involved in their own health decisions. With greater access to information, they can sometimes self-diagnose certain health issues. Due to digitization, they have better communication with healthcare providers and easier access to their own test results.

Monitoring illness

Healthcare providers are better equipped to gather and analyze data. So, healthcare outcomes are faster and easier to realize. Providers can react earlier to conditions. And they can even sometimes predict medical conditions before any symptoms appear. Therapies are transforming to a more user-centric design. This is all possible because digital networking of data informs caregivers earlier and keeps them informed. We have moved past the patient’s chart as the most important source of information.

Improving wellness

The ability to predict medical conditions gives providers a tool to promote wellness. This is changing the healthcare value chain. Remote monitoring is possible, making trips to the clinic or doctor’s office less necessary. Wearable monitoring devices have changed the medical landscape. And the use of wearable devices is expected to grow. According to the McKinsey Global Institute (MGI), 1.3 billion people will be using fitness trackers by the year 2025. In some regions, this will account for up to 56% of the population. The millennial generation sums up the benefits in a word: convenience.

The blending of physical and digital realms into a common reality is referred to as the Internet of Things (IoT). The IoT makes many things possible that were only dreamed of a few years ago. It extends the reach of information technology. From remote locations, we can electronically monitor and control things in the physical world. Basically, it is the digitizing of the physical world.

With the IoT, MGI predicts a savings in healthcare treatment costs of up to $470 billion per year by 2025. But even more important is the improvement in healthcare. In addition to driving down treatment costs, this will extend healthy life spans and improve the quality of life for millions of people. And it will improve access to healthcare for those who are underserved in the present system. Plus, this extensive use of fitness tracking devices will create a multi-billion dollar industry.

Re-shaping the patient experience

The patients of today and tomorrow have more information and more options than ever before. Patients are already seeing increased value from the Big Data that healthcare professionals now have access to. Patients are more engaged in their own care. We are entering an age of personalized healthcare based on far-reaching knowledge bases.

Because of digital innovation, healthcare consumers can more easily seek relief when they are sick. They can be more involved in disease prevention and self-supported care. With patient-owned medical devices, they are connected to the Big Data of cloud computing. This cloud-based information provides proven treatments and outcomes for specific conditions.

Value chain improvements

The digital value network connects all aspects of the healthcare ecosystem in real time. This connectivity drives better healthcare outcomes that are specifically relevant to the patient. Digital innovation in healthcare improves interactions to provide personalized care based on Big Data. In that respect, you can think of it as Big Medicine for the little guy. A massive database gives healthcare providers a 360-degree view of the patient. Data is stored in the cloud and processed in the core platform.

Services and functions that this system provides include medication reminders and health tracking for patients, their families, and friends. Remote home monitoring and emergency detection offer an increased level of safety and protection. Remote diagnostics can mean staying at home instead of being hospitalized. And predicting organ or other physical failures before they happen can save lives.

SAP software provides a single platform that brings together healthcare providers, patients, and value-added services. It offers a seamless digitization of the entire patient experience. And it provides results in real time, available to all parts of the healthcare ecosystem. This broad connectivity creates an omni-channel, end-to-end patient experience.

To learn more about Digital Transformation for Healthcare, click here.


About Martin Kopp

Martin Kopp is the global general manager for Healthcare at SAP. He is responsible for setting the strategy and articulating the vision and direction of SAP's healthcare-provider industry solutions, influencing product development, and fostering executive-level relationships with key customers, IT influencers, partners, analysts, and media.