Help Wanted: What Good Is Job Creation If We Can’t Find People With The Right Skills?

Susan Galer

According to the International Labour Organization, the world needs to create more than 500 million new jobs by 2020 to provide career opportunities both for people already in the workforce and for the young people who will join it.

But the biggest roadblock to economic growth may well be finding workers with the skills and training needed to get those jobs done. The European Centre for the Development of Vocational Training projects that demand for highly qualified people will increase by 16 million between now and 2020, while demand for low-skilled workers will fall by around 12 million.

Indeed, the European Commission estimates a shortfall of up to 900,000 information and communications technology (ICT) professionals in Europe alone by 2015. The skills gap is especially acute in emerging regions such as Africa and the Middle East, where youth unemployment rates range from 11 percent to almost 28 percent.

Training the workforce of the future may seem like a daunting task. For companies like SAP, however, the future literally depends on people who understand technology innovations and can put them to work, not just for the company itself but also for its customers and partners.

That’s why SAP has recently scaled up its partnerships with local start-ups, small and midsize businesses, and schools across Europe, Africa, and the Middle East to give people of all ages the qualifications they need to make good on innovations such as in-memory computing, mobile, and cloud. Highlights from just two of the latest program components illustrate the incredible scope of this undertaking, which SAP calls the “EMEA Workforce of the Future.”

The Academy Cube is an online eLearning platform slated for initial roll-out in Spain, Greece, and Portugal. The goal is to train over 100,000 graduates and job seekers in Europe. Here’s an excerpt from a recent blog that describes the program:

“At the heart of the Academy Cube initiative is a cloud-based internet platform that companies and institutions can use to provide e-learning courses and post job offerings. People looking for work can use the platform to get the skills and the qualifications high-tech jobs require. And potential employers know what young talents they’re potentially hiring.”

Skills for Africa will initially train workers in South Africa, Angola, Nigeria, Kenya, and Senegal, supported by 56 partners to help cover the vast region. Announced during co-CEO Jim Hagemann Snabe’s recent trip to Johannesburg, the hybrid classroom and e-learning approach takes into account that internet access isn’t a given, and focuses on key industries including the public and financial sectors, utilities, and oil and gas. Students receive training kits with printed course materials and an encrypted USB drive, in English, French, and Portuguese.

SAP’s long-time commitment to sustainable business also encompasses volunteers who teach science, technology, engineering, math, and entrepreneurship in schools. In addition, SAP trains unemployed people in Spain, Germany, Switzerland, and BeLux. In 2012, 16,800 workers were trained, and 73 percent found a job immediately. To deliver on the promise of its innovations, the IT industry at large needs even more programs like these to help close the gap between would-be employees and opportunities.

43 Facts On Purpose And Sustainability In The Digital Economy

Peter Johnson

Part 5 of the six-part blog series “Facts on the Future of Business”

Innovation in the business world is accelerating exponentially, with new disruptive technologies and trends emerging that are fundamentally changing how businesses and the global economy operate. To adapt, thrive, and innovate, we all need to be aware of these technologies and trends and understand the opportunities or threats they might present to our organizations, our careers, and society as a whole.

With this in mind, I recently had the opportunity to compile 99 Facts on the Future of Business in the Digital Economy. This presentation includes facts, predictions, and research findings on some of the most impactful technologies and trends that are driving the future of business in the digital economy.

To make it easier to find facts on specific topics, I have grouped them into six subsets; below is the fifth:

Earth

With the coming and going of ice ages over the last 400,000+ years, CO2 in the earth’s atmosphere fluctuated between 180 ppm and 300 ppm. However, CO2 levels have skyrocketed and now exceed 400 ppm for the first time in recorded history.

Source: “The Relentless Rise of Carbon Dioxide,” NASA Global Climate Change: Vital Signs of the Planet.

New digital technologies can enable a 20% reduction in global carbon emissions by 2030. This is equivalent to eliminating 100% of China’s CO2 emissions, plus another 1.5 billion tons.

Source: “SMARTer2030: Australian Opportunity for ICT Enabled Emission Reductions,” Telstra Corporation.

In the last two decades, 9.6% of the earth’s total wilderness area, an estimated 3.3 million square kilometers, has been lost.

Source: “Catastrophic Declines in Wilderness Areas Undermine Global Environment Targets,” Current Biology.

Many Latin American governments are turning to artificial intelligence to aid in their forest conservation efforts.

Source: “10 Innovations That Are Changing Conservation,” Cool Green Science.

Air pollution continues to rise at an alarming rate, and now 92% of the world population is exposed to air pollution above WHO air quality guidelines.

Source: “Ambient Air Pollution: A Global Assessment of Exposure and Burden of Disease,” World Health Organization.

Every year, nearly 600,000 children under the age of five die from diseases caused or exacerbated by the effects of air pollution.

Source: “Clear the Air for Children,” United Nations Children’s Fund (UNICEF).

Inequity

GDP per capita has increased roughly 1,000% since the 1970s.

Source: “GDP Per Capita,” The World Bank.

CEO pay has risen 1,000% over the past 40-plus years.

Source: “World Economic Forum Annual Meeting 2017: Responsive and Responsible Leadership,” World Economic Forum.

But average worker pay has increased just 11% since the 1970s, essentially stagnating over the past 40-plus years.

Source: “The Productivity–Pay Gap,” Economic Policy Institute.

If ordinary citizens don’t have incomes to buy products made by corporations, how can those corporations prosper? The IMF found countries with less inequality perform better.

Source: “Nobel Economist: One-Percenters, Pay Your Taxes,” CNN.

Although GDP growth is an indicator of progress, it has concealed growing inequality. Economies need a balanced scorecard that also assesses and prioritizes quality of life across the population.

Source: “An Economy for the 99%,” Oxfam International.

By broadly addressing gender equity at work and in society, the world could add $12 trillion to annual gross domestic product in 2025.

Source: “Realizing Gender Equality’s $12 Trillion Economic Opportunity,” McKinsey Global Institute.

The 43 public companies in the DiversityInc Top 50 were 24% more profitable than the S&P 500 average.

Source: “Cultural Diversity in the Workplace: How Diversity at Work Makes More Money for You,” The Balance.

Migrants make up just 3.4% of the world’s population, but they contribute nearly 10% of global GDP. Today, immigrants earn 20% to 30% less than native workers, but if countries narrowed this wage gap by just 5% to 10%, they could generate an additional $1 trillion in global economic output.

Source: “Global Migration’s Impact and Opportunity,” McKinsey Global Institute.

Improving lives

In initial tests, a machine-learning algorithm created at Carnegie Mellon was able to predict heart attacks four hours in advance, with 80% accuracy.

Source: “Of Prediction and Policy,” The Economist.

Artificial intelligence can predict where epidemics will happen. AIME developed a platform with 87% accuracy in predicting dengue fever outbreaks three months in advance. Now they hope to similarly target other diseases, such as Ebola and Zika.

Source: “Artificial Intelligence Innovation Report,” Deloitte.

An estimated 45.8 million people are trapped in some form of slavery in 167 countries.

Source: “Global Findings,” Walk Free Foundation: The Global Slavery Index.

Advanced analytics and Big Data are enabling coordinated efforts to combat human trafficking networks and rapid responses when victims are located.

Source: “Tracing a Web of Destruction: Can Big Data Fight Human Trafficking?” HBS Digital Initiative.

Two billion individuals and 200 million small businesses in emerging economies lack access to basic financial services and credit. Broad adoption of mobile banking in developing nations could create 95 million new jobs and increase GDP by $3.7 trillion by 2025.

Source: “How Digital Finance Could Boost Growth in Emerging Economies,” McKinsey Global Institute.

Patients dying while waiting for an organ donor could soon be a thing of the past. By 2030, organs will be biologically 3D-printed on demand.

Source: “Healthcare in 2030: Goodbye Hospital, Hello Home-Spital,” World Economic Forum.

Resource management

On the edge of the Sahara, Morocco is building what will be the world’s largest solar power plant, capable of providing energy even after the sun sets.

Source: “Morocco Unveils A Massive Solar Power Plant in the Sahara,” NPR.

Morocco plans to generate 14% of its energy from solar by 2020, and hopes to eventually export solar energy to Europe.

Source: “The Colossal African Solar Farm That Could Power Europe,” BBC.

An extremely large city can lose as much as 500 billion liters of drinking water each year through leakage.

Source: “Water and Cities – Facts and Figures,” United Nations.

More than 300,000 billion liters of water could be saved globally by using new information and communications technologies to increase resource management efficiencies.

Source: “Quantifying the Opportunity,” Global e-Sustainability Initiative (GeSI).

Barcelona uses the IoT to optimize urban systems and enhance citizen services. It saves $95 million annually through reduced water and electricity consumption, has increased parking revenues by $50 million a year, and has generated 47,000 new jobs.

Source: “How Smart City Barcelona Brought the Internet of Things to Life,” Data-Smart City Solutions, Ash Center for Democratic Governance and Innovation at Harvard Kennedy School.

By 2019, 40% of local and regional governments will use the IoT to turn infrastructure such as roads, streetlights, and traffic signals into assets instead of liabilities.

Source: “IDC FutureScape: Worldwide Internet of Things (IoT) 2017 Predictions,” IDC Research Inc.

Global urban populations will add 2.5 billion people by 2050. This massive urban expansion will require as much as $70 trillion in infrastructure spending.

Source: “In a Fast-Changing World, Can Cities Be Built with Long-Term Perspective?” EY.

Global debt has more than doubled since the turn of the century to $152 trillion and now represents a record high 225% of global GDP. This creates a vicious feedback loop in which the debt overhang exacerbates the economic slowdown and lower economic growth hampers deleveraging.

Source: “The IMF Is Worried About the World’s $152 Trillion Debt Pile,” Bloomberg.

Trust and corruption

93% of CEOs believe it’s important to engender trust that their company “will do the right thing.”

Source: “Connecting the Dots: How Purpose Can Join Up Your Business,” PwC.

72% of people feel that companies have become more dishonest.

Source: “The State of the Debate on Purpose in Business,” EY Beacon Institute.

There is a growing level of distrust: only 15% of people believe that society’s institutional pillars (government, businesses, media) are working for the common person.

Source: “2017 Edelman Trust Barometer,” Edelman.

Leading up to the U.S. election, the top fake news stories on Facebook generated 20% more engagement than factual stories.

Source: “This Analysis Shows How Fake Election News Stories Outperformed Real News on Facebook,” BuzzFeed News.

Bribery reduces global GDP by $1.5 trillion to $2 trillion each year, as it drives suboptimal business decision making, corrupting economic performance.

Source: “Corruption: Costs and Mitigating Strategies,” International Monetary Fund.

To combat corruption and tax evasion in its cash economy (only 2.6% of its citizens pay taxes), the Indian government demonetized 80% of its currency in three hours.

Source: “Demonetization | This Is a New Indian Sunrise,” DNA India.

India could eliminate the need for credit cards, debit cards, and ATMs in three years by switching to biometric payments, as nearly 1.1 billion citizens have already registered their biometric data.

Source: “First Cash, Now India Could Ditch Card Payments by 2020,” CNN.

Purpose

In a study of 100 variables, seeing purpose and value in work was the single most important factor that motivated employees. Yes, more than compensation.

Source: “Purpose Trumps Cash + Other New Research Findings,” LinkedIn.

75% of millennials would take a pay cut to work for a socially and environmentally responsible company.

Source: “2016 Cone Communications Millennial Employee Engagement Study,” Cone Communications.

Only 13% of employees worldwide are engaged, meaning that the other 87% are not involved in, enthusiastic about, or committed to their work and company.

Source: “The Worldwide Employee Engagement Crisis,” Gallup.

Companies with engaged employees outperform their peers by up to 202%.

Source: “The Importance of Employee Engagement,” Dale Carnegie Training.

How millennials want to work and live is something leaders need to take seriously. Just 40% of millennials feel strongly connected to their company’s mission.

Source: “Millennials Not Connecting With Their Company’s Mission,” Gallup.

During the next year, one in four millennials plans to leave his or her current employer, and by 2020, two in three millennials expect to have found a new employer.

Source: “The 2016 Deloitte Millennial Survey – Winning over the next generation of leaders,” Deloitte.

Organizations in which employees perceive meaning at work are 21% more profitable.

Source: “Meaning@Work, Leadership in times of digitization,” Future of Leadership Initiative.

87% of millennials say that they base their purchasing decisions on whether or not a company makes positive social efforts.

Source: “Why Millennials Care About Purpose-Driven Business,” D!gitalist Magazine.

To view all 99 Facts on the Future of Business in the Digital Economy, check out the SlideShare, or browse the other subsets below.

To see the rest of the series, check out our page “Facts on the Future of Business.” Every Thursday we cover one of the six topics:

  • The value imperative to embrace the digital economy
  • Technologies driving the digital economy
  • Customer experience and marketing in the digital economy
  • The future of work in the digital economy
  • Purpose and sustainability in the digital economy
  • Supply networks in the digital economy

About Peter Johnson

Peter Johnson is a Senior Director of Marketing Strategy and Thought Leadership at SAP, responsible for developing easy-to-understand corporate-level and cross-solution messaging. Peter has proven experience leading innovative programs that accelerate and scale go-to-market activities and drive operational efficiencies at industry-leading solution providers and global manufacturers.

SolarCoin: How Blockchain Is Incentivizing A 5,000 Gigawatt Quest To Save The Planet

Jacqueline Prause

A trip through the idyllic farmland of Bavaria, in the south of Germany, is a testament to the enthusiasm here for solar energy, one of the many green technologies Germany is embracing as it undergoes a national Energy Transition aimed at 60% renewable energy sources by 2050.

Here, aging dairy barns are entirely covered with solar panels to capture the sun’s rays that come over the nearby Alpine peaks. Munching on meadow grasses, the bovine residents hardly seem perturbed by the barns’ hyper-modern solar installations – and the barns themselves are thus doubly productive for the farmers.

Worldwide, about 7 million solar installations are already grid-connected, amounting to some 300 gigawatts (GW) of capacity, roughly the generation capacity of 500 nuclear reactors. Germany alone had a solar generation capacity of 41 GW in 2016, delivering more than six percent of its total energy consumption – impressive for a country that lies well outside the sun belt.

With climate change experts forecasting that global warming could raise the earth’s temperature by two degrees Celsius by mid-century, people are increasingly turning to low-carbon energy sources like solar, wind, and hydropower to mitigate its effects. According to the International Energy Agency (IEA), solar energy could become the largest source of electricity by 2050 – ahead of other energy sources like fossil fuels, wind, and hydro. Based on figures from the IEA, the solar community hopes to couple another 5,000 GW of solar power to the grid – the equivalent of an additional 200 million households using solar power.

“We are the first generation to live global climate change in real-time and to feel it, but we are also the last generation to be able to do something about it,” Francois Sonnet, co-founder of ElectriCChain, an affiliate of the SolarCoin Foundation, recently told members of the European Parliament in a session on science and technology options (STOA). “The technology is there; the will and the money isn’t.”

Incentivizing solar uptake

Rather than wait for more money to flow into the solar industry, the SolarCoin Foundation, based in Greenwich, Connecticut, is incentivizing solar production for participating households and businesses, one megawatt-hour at a time – and it’s using blockchain technology to do it.

Founded in 2014 by a group of solar experts and macro-economists, the SolarCoin Foundation is an international network of volunteers and community members whose job it is to oversee the distribution of SolarCoins (cryptoexchange symbol: SLR) – a blockchain-based digital currency that is distributed to solar producers at a rate of one coin per megawatt-hour of solar energy produced, based on verified meter readings. The organization maintains a public ledger that records each SolarCoin given out to solar electricity generators.

The SolarCoin is both an incentive and a reward for solar producers to participate in the solar economy. The program is often likened to air miles for frequent flyers in the airline industry. “Basically, you enable a prosumer to deliver to the grid and to use the infrastructure – or to set a whole new infrastructure, in the case of micro-grids in developing countries – and to bill energy to a neighbor. That’s the purpose of peer-to-peer energy,” explains Sonnet.

SolarCoin is active in 32 countries, with a network of affiliates and partners serving different regions: Solar Change is active in South America, EMEA, and the United States; Solcrypto is the claims facilitator for the Asia-Pacific region; and ElectriCChain, registered in Andorra, records solar energy data to improve solar tools and to support monitoring and academic research.

SolarCoin is the first digital asset to be recognized at the supranational level by the International Renewable Energy Agency (IRENA) as a source of financial support for the solar industry. Affiliate ElectriCChain recently received high praise at the UN Climate Summit in Morocco, winning the “Homes” category for its groundbreaking nano-grid project. The recognition has lent legitimacy to the organization as it promotes its work around the world. “Depending on the places we go to, some people don’t necessarily know the good story of solar energy,” says Sonnet. “Working with these big institutions, like the UN and IRENA, certainly helps.”

Harvesting SolarCoins at home

To manage the distribution of individual SolarCoins, the SolarCoin Foundation operates its own blockchain, which is 200 to 300 times more carbon-efficient than the Bitcoin blockchain with its limited processing capacity (a SolarCoin blockchain explorer is available here). Each SolarCoin also has real value attached to it: one coin is the equivalent of one megawatt-hour of solar power. By way of example, Sonnet explains how people can accumulate SolarCoins: “Say five kilowatts of solar generation in Munich would produce six megawatt hours, so that would be six SolarCoins on a yearly basis.”
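
That grant rule is simple enough to verify with a few lines of code. Here is a minimal Python sketch of the one-coin-per-megawatt-hour conversion using Sonnet’s Munich example; it illustrates the arithmetic only and is not official SolarCoin tooling.

```python
# Minimal sketch (not official SolarCoin tooling) of the stated grant rule:
# one SolarCoin (SLR) per verified megawatt-hour of solar production.

def solarcoins_granted(energy_kwh: float) -> float:
    """Convert verified solar production in kWh to SolarCoins (1 SLR = 1 MWh)."""
    return energy_kwh / 1000.0

# Sonnet's example: a 5 kW installation in Munich producing ~6 MWh per year.
print(solarcoins_granted(6_000))  # -> 6.0 SLR per year
```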

Approximately 420,000 SolarCoins have been granted to solar producers worldwide, and US$500 million worth of SolarCoins are currently waiting to be claimed. The SolarCoin Foundation expects its distribution program to last 40 years as it distributes the 97.5 billion SolarCoins, which represent 97,500 terawatt-hours of solar electricity.

For individual owners, the coins are distributed through the platform once every six months. No equipment is necessary beyond the solar installation itself. A solar producer can harvest SolarCoins in two ways: download the digital wallet from the SolarCoin web site, which embeds an API for claiming SolarCoins directly from the blockchain; or use a $10 device called a Raspberry Pi as a data logger that gathers information from the solar installation and publishes it to the blockchain. Discussions are also in progress with solar equipment providers to embed SolarCoin more dynamically into the equipment they sell, enabling granting down to the minute. Find out how to enroll with SolarCoin here.
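
For the curious, the sketch below shows, in Python, roughly what the data-logger path might look like: read a cumulative meter value and publish it for claiming. The endpoint URL, payload fields, and installation ID are invented for illustration; the SolarCoin Foundation’s actual claiming API may look quite different.

```python
# Hypothetical sketch of the Raspberry Pi data-logger role described above:
# read the installation's cumulative production and publish it for claiming.
# The endpoint URL, payload fields, and installation ID are invented for
# illustration; they are NOT the SolarCoin Foundation's actual API.
import time

import requests  # third-party: pip install requests

CLAIM_ENDPOINT = "https://example.org/api/v1/readings"  # placeholder URL
INSTALLATION_ID = "DE-MUC-0001"                         # hypothetical ID


def read_meter_kwh() -> float:
    """Stub for the hardware call that reads cumulative production in kWh."""
    return 6000.0  # replace with a real inverter or meter query


def publish_reading() -> None:
    """Send one timestamped meter reading to the (placeholder) claim service."""
    payload = {
        "installation_id": INSTALLATION_ID,
        "timestamp": int(time.time()),
        "cumulative_kwh": read_meter_kwh(),
    }
    response = requests.post(CLAIM_ENDPOINT, json=payload, timeout=10)
    response.raise_for_status()


if __name__ == "__main__":
    publish_reading()
```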

Seeding the solar economy

What can you do with your SolarCoins? Most people will likely exchange the digital currency for euros or dollars on one of many cryptoexchanges. At the moment, the value of an individual SolarCoin is around $0.24. Perhaps a better idea might be to hold the coin until more people join the network. Like many blockchain-based ventures, the value of the network increases markedly as more people join, adding more nodes and producing more transactions. The larger the network, the greater its value. This video from SolarCoin founder Nick Gogerty explains the concept of currency valuation in the network.
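
As a toy illustration of that network effect, the sketch below applies Metcalfe’s law, which values a network by its possible pairwise connections. The law and the participant counts are assumptions made here for illustration, not a valuation model endorsed by SolarCoin, and certainly not a price forecast.

```python
# Toy illustration of the network effect using Metcalfe's law (value grows
# with the number of possible pairwise connections). The heuristic and the
# participant counts are illustrative assumptions, not a price forecast.

def pairwise_connections(n_participants: int) -> int:
    """Metcalfe-style proxy for network value: n * (n - 1) / 2 connections."""
    return n_participants * (n_participants - 1) // 2

today = pairwise_connections(10_000)       # hypothetical current network size
target = pairwise_connections(1_000_000)   # the foundation's 2019 target
print(f"{target / today:,.0f}x more potential connections")  # ~10,000x
```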

The SolarCoin Foundation expects to have one million participants by the end of 2019. This would provide the uplift to bring the value of one coin, each representing a megawatt-hour, to between $20 and $30. As noted in Scientific American, “For now, the handouts act as a reward – a little token of thanks – to the people who are already doing their part for the environment.”

The envisioned future for the currency is that prosumers will be able to use SolarCoins to directly pay for goods and services, seeding the solar economy, which might, for example, include battery storage and additional services. “To the extent that solar participants understand that, then it makes a compelling case for SolarCoin,” says Sonnet.

According to Sonnet, SolarCoin plans to onboard “a couple dozen thousand” solar installations in the medium term. This should trigger the next wave of installation owners to participate in the network, and draw in businesses as well. “We’re at the very beginning of the value creation of SolarCoin,” says Sonnet. “It takes time obviously, but it’s like a spinning wheel: once the cogs are in place, it starts spinning by itself, and this will enable the Energy Transition.”

For more on blockchain’s role in connected cities, see Running Future Cities On Blockchain.

About Jacqueline Prause

Jacqueline Prause is the Senior Managing Editor of Media Channels at SAP. She writes, edits, and coordinates journalistic content for SAP.info, SAP’s global online news magazine for customers, partners, and business influencers.

Heroes in the Race to Save Antibiotics

Dr. David Delaney, Joseph Miles, Walt Ellenberger, Saravana Chandran, and Stephanie Overby

Last August, a woman arrived at a Reno, Nevada, hospital and told the attending doctors that she had recently returned from an extended trip to India, where she had broken her right thighbone two years earlier. The woman, who was in her 70s, had subsequently developed an infection in her thigh and hip for which she was hospitalized in India several times. The Reno doctors recognized that the infection was serious—and the visit to India, where antibiotic-resistant bacteria run rampant, raised red flags.

When none of the 14 antibiotics the physicians used to treat the woman worked, they sent a sample of the bacterium to the U.S. Centers for Disease Control and Prevention (CDC) for testing. The CDC confirmed the doctors’ worst fears: the woman had a class of microbe called carbapenem-resistant Enterobacteriaceae (CRE). Carbapenems are a powerful class of antibiotics used as last-resort treatment for multidrug-resistant infections. The CDC further found that, in this patient’s case, the pathogen was impervious to all 26 antibiotics approved by the U.S. Food and Drug Administration (FDA).

In other words, there was no cure.

This is just the latest alarming development signaling the end of the road for antibiotics as we know them. In September, the woman died from septic shock, in which an infection takes over and shuts down the body’s systems, according to the CDC’s Morbidity and Mortality Weekly Report.

Other antibiotic options, had they been available, might have saved the Nevada woman. But the solution to the larger problem won’t be a new drug. It will have to be an entirely new approach to the diagnosis of infectious disease, to the use of antibiotics, and to the monitoring of antimicrobial resistance (AMR)—all enabled by new technology.

But that new technology is not being implemented fast enough to prevent what former CDC director Tom Frieden has nicknamed “nightmare bacteria.” And the nightmare is becoming scarier by the year. A 2014 British study calculated that 700,000 people die globally each year because of AMR. By 2050, according to the same 2014 estimate, the toll could grow to 10 million deaths a year and a cumulative US$100 trillion in lost economic output. And the rate of AMR is growing exponentially, thanks to the speed with which humans serving as hosts for these nasty bugs can move among healthcare facilities—or countries. In the United States, for example, CRE had been seen only in North Carolina in 2000; today it’s nationwide.

Abuse and overuse of antibiotics in healthcare and livestock production have enabled bacteria to both mutate and acquire resistant genes from other organisms, resulting in truly pan-drug resistant organisms. As ever-more powerful superbugs continue to proliferate, we are potentially facing the deadliest and most costly human-made catastrophe in modern times.

“Without urgent, coordinated action by many stakeholders, the world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill,” said Dr. Keiji Fukuda, assistant director-general for health security for the World Health Organization (WHO).

Even if new antibiotics could solve the problem, there are obstacles to their development. For one thing, antibiotics have complex molecular structures, which slows the discovery process. Further, they aren’t terribly lucrative for pharmaceutical manufacturers: public health concerns call for new antimicrobials to be financially accessible to patients and used conservatively precisely because of the AMR issue, which reduces the financial incentives to create new compounds. The last entirely new class of antibiotics was introduced 30 years ago. Finally, bacteria will develop resistance to new antibiotics as well if we don’t adopt new approaches to using them.

Technology can play the lead role in heading off this disaster. Vast amounts of data from multiple sources are required for better decision making at all points in the process, from tracking or predicting antibiotic-resistant disease outbreaks to speeding the potential discovery of new antibiotic compounds. However, microbes will quickly adapt and resist new medications, too, if we don’t also employ systems that help doctors diagnose and treat infection in a more targeted and judicious way.

Indeed, digital tools can help in all four actions that the CDC recommends for combating AMR: preventing infections and their spread, tracking resistance patterns, improving antibiotic use, and developing new diagnostics and treatment.

Meanwhile, individuals who understand both the complexities of AMR and the value of technologies like machine learning, human-computer interaction (HCI), and mobile applications are working to develop and advocate for solutions that could save millions of lives.

Keeping an Eye Out for Outbreaks

Like others who are leading the fight against AMR, Dr. Steven Solomon has no illusions about the difficulty of the challenge. “It is the single most complex problem in all of medicine and public health—far outpacing the complexity and the difficulty of any other problem that we face,” says Solomon, who is a global health consultant and former director of the CDC’s Office of Antimicrobial Resistance.

Solomon wants to take the battle against AMR beyond the laboratory. In his view, surveillance—tracking and analyzing various data on AMR—is critical, particularly given how quickly and widely it spreads. But surveillance efforts are currently fraught with shortcomings. The available data is fragmented and often not comparable. Hospitals fail to collect the representative samples necessary for surveillance analytics, collecting data only on those patients who experience resistance and not on those who get better. Laboratories use a wide variety of testing methods, and reporting is not always consistent or complete.

Surveillance can serve as an early warning system. But weaknesses in these systems have caused public health officials to consistently underestimate the impact of AMR in loss of lives and financial costs. That’s why improving surveillance must be a top priority, says Solomon, who previously served as chair of the U.S. Federal Interagency Task Force on AMR and has been tracking the advance of AMR since he joined the U.S. Public Health Service in 1981.

A Collaborative Diagnosis

Ineffective surveillance has also contributed to huge growth in the use of antibiotics when they aren’t warranted. Strong patient demand and financial incentives for prescribing physicians are blamed for antibiotics abuse in China. India has become the largest consumer of antibiotics on the planet, in part because they are prescribed or sold for diarrheal diseases and upper respiratory infections for which they have limited value. And many countries allow individuals to purchase antibiotics over the counter, exacerbating misuse and overuse.

In the United States, antibiotics are improperly prescribed 50% of the time, according to CDC estimates. One study of adult patients visiting U.S. doctors to treat respiratory problems found that more than two-thirds of antibiotics were prescribed for conditions that were not infections at all or for infections caused by viruses—for which an antibiotic would do nothing. That’s 27 million courses of antibiotics wasted a year—just for respiratory problems—in the United States alone.

And even in countries where there are national guidelines for prescribing antibiotics, those guidelines aren’t always followed. A study published in the medical journal Family Practice showed that Swedish doctors, both those trained in Sweden and those trained abroad, inconsistently followed rules for prescribing antibiotics.

Solomon strongly believes that, worldwide, doctors need to expand their use of technology in their offices or at the bedside to guide them through a more rational approach to antibiotic use. Doctors have traditionally been reluctant to adopt digital technologies, but Solomon thinks that the AMR crisis could change that. New digital tools could help doctors and hospitals integrate guidelines for optimal antibiotic prescribing into their everyday treatment routines.

“Human-computer interactions are critical, as the amount of information available on antibiotic resistance far exceeds the ability of humans to process it,” says Solomon. “It offers the possibility of greatly enhancing the utility of computer-assisted physician order entry (CPOE), combined with clinical decision support.” Healthcare facilities could embed relevant information and protocols at the point of care, guiding the physician through diagnosis and prescription and, as a byproduct, facilitating the collection and reporting of antibiotic use.

Cincinnati Children’s Hospital’s antibiotic stewardship division has deployed a software program that gathers information from electronic medical records, order entries, computerized laboratory and pathology reports, and more. The system measures baseline antimicrobial use, dosing, duration, costs, and use patterns. It also analyzes bacteria and trends in their susceptibilities and helps with clinical decision making and prescription choices. The goal, says Dr. David Haslam, who heads the program, is to decrease the use of “big gun” super antibiotics in favor of more targeted treatment.
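
To make the idea concrete, here is a minimal Python sketch of one kind of rule such a stewardship system might encode, flagging a “big gun” order when culture results suggest a narrower drug would do. The drug list, record format, and message text are simplified assumptions for illustration, not Cincinnati Children’s actual logic.

```python
# Simplified sketch of one stewardship rule such a system might encode:
# flag a broad-spectrum ("big gun") order when culture results show a
# narrower drug would work. Drug names and record format are assumptions.
from typing import Optional

BROAD_SPECTRUM = {"meropenem", "imipenem", "vancomycin"}  # illustrative list


def review_order(drug: str, culture_result: Optional[str],
                 narrow_option: Optional[str]) -> str:
    """Suggest de-escalation when a targeted alternative is available."""
    if drug in BROAD_SPECTRUM and culture_result and narrow_option:
        return (f"Alert: culture shows {culture_result}; "
                f"consider de-escalating {drug} to {narrow_option}.")
    return "Order accepted."


print(review_order("meropenem",
                   "E. coli susceptible to ceftriaxone",
                   "ceftriaxone"))
```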

While this approach is not yet widespread, there is consensus that incorporating such clinical-decision support into electronic health records will help improve quality of care, contain costs, and reduce overtreatment in healthcare overall—not just in AMR. A 2013 randomized clinical trial found that doctors who used decision-support tools were significantly less likely to order antibiotics than those in the control group and prescribed 50% fewer broad-spectrum antibiotics.

Putting mobile devices into doctors’ hands could also help them accept decision support, believes Solomon. Last summer, Scotland’s National Health Service developed an antimicrobial companion app to give practitioners nationwide mobile access to clinical guidance, as well as an audit tool to support boards in gathering data for local and national use.

“The immediacy and the consistency of the input to physicians at the time of ordering antibiotics may significantly help address the problem of overprescribing in ways that less-immediate interventions have failed to do,” Solomon says. In addition, handheld devices with so-called lab-on-a-chip technology could be used to test clinical specimens at the bedside and transmit the data across cellular or satellite networks in areas where infrastructure is more limited.

Artificial intelligence (AI) and machine learning can also become invaluable technology collaborators to help doctors more precisely diagnose and treat infection. In such a system, “the physician and the AI program are really ‘co-prescribing,’” says Solomon. “The AI can handle so much more information than the physician and make recommendations that can incorporate more input on the type of infection, the patient’s physiologic status and history, and resistance patterns of recent isolates in that ward, in that hospital, and in the community.”

Speed Is Everything

Growing bacteria in a dish has never appealed to Dr. James Davis, a computational biologist with joint appointments at Argonne National Laboratory and the University of Chicago Computation Institute. An early member of a growing breed of computational biologists, Davis chose a PhD advisor in 2004 who was steeped in bioinformatics technology “because you could see that things were starting to change,” he says. He was one of the first in his microbiology department to submit a completely “dry” dissertation—that is, one that was all digital, with nothing grown in a lab.

Upon graduation, Davis wanted to see if it was possible to predict whether an organism would be susceptible or resistant to a given antibiotic, leading him to explore the potential of machine learning to predict AMR.

As the availability of cheap computing power has gone up and the cost of genome sequencing has gone down, it has become possible to sequence a pathogen sample and detect its resistance mechanisms directly. This could allow doctors to identify the nature of an infection in minutes instead of hours or days, says Davis.

Davis is part of a team creating a giant database of bacterial genomes with AMR metadata for the Pathosystems Resource Integration Center (PATRIC), funded by the U.S. National Institute of Allergy and Infectious Diseases to collect data on priority pathogens, such as tuberculosis and gonorrhea.

Because the current inability to identify microbes quickly is one of the biggest roadblocks to making an accurate diagnosis, the team’s work is critically important. The standard method for identifying drug resistance is to take a sample from a wound, blood, or urine and expose the resident bacteria to various antibiotics. If the bacterial colony continues to divide and thrive despite the presence of a normally effective drug, it indicates resistance. The process typically takes between 16 and 20 hours, itself an inordinate amount of time in matters of life and death. For certain strains of antibiotic-resistant tuberculosis, though, such testing can take a week. While physicians are waiting for test results, they often prescribe broad-spectrum antibiotics or make a best guess about what drug will work based on their knowledge of what’s happening in their hospital, “and in the meantime, you either get better,” says Davis, “or you don’t.”

At PATRIC, researchers are using machine-learning classifiers to identify regions of the genome involved in antibiotic resistance that could form the foundation for a “laboratory free” process for predicting resistance. Being able to identify the genetic mechanisms of AMR and predict the behavior of bacterial pathogens without petri dishes could inform clinical decision making and improve reaction time. Thus far, the researchers have developed machine-learning classifiers for identifying antibiotic resistance in Acinetobacter baumannii (a big player in hospital-acquired infection), methicillin-resistant Staphylococcus aureus (a.k.a. MRSA, a worldwide problem), and Streptococcus pneumoniae (a leading cause of bacterial meningitis), with accuracies ranging from 88% to 99%.
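
As a rough illustration of the general technique (not PATRIC’s actual pipeline), the Python sketch below represents each genome by its k-mer counts and trains a scikit-learn classifier to separate resistant from susceptible samples. The sequences and labels are synthetic stand-ins; real work uses curated genomes with AMR metadata.

```python
# Toy sketch of the general technique: represent each genome by k-mer counts
# and train a classifier to separate resistant from susceptible samples.
# Sequences and labels below are synthetic; real pipelines use curated
# genomes with AMR metadata, as in the PATRIC database.
from itertools import product

from sklearn.ensemble import RandomForestClassifier


def kmer_counts(sequence: str, k: int = 3) -> list:
    """Sliding-window counts of every possible DNA k-mer (4^k features)."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    windows = [sequence[i:i + k] for i in range(len(sequence) - k + 1)]
    return [windows.count(km) for km in kmers]


# Synthetic training data: 1 = resistant, 0 = susceptible.
genomes = ["ACGTACGTGGCC", "TTTTACGTACGT", "GGCCGGCCACGT", "ACGTTTTTACGT"]
labels = [1, 0, 1, 0]

X = [kmer_counts(g) for g in genomes]
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
print(clf.predict([kmer_counts("GGCCACGTGGCC")]))  # predicted label
```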

Houston Methodist Hospital, which uses the PATRIC database, is researching multidrug-resistant bacteria, specifically MRSA. Not only does resistance increase the cost of care, but people with MRSA are 64% more likely to die than people with a nonresistant form of the infection, according to WHO. Houston Methodist is investigating the molecular genetic causes of drug resistance in MRSA in order to identify new treatment approaches and help develop novel antimicrobial agents.

The Hunt for a New Class of Antibiotics

There are antibiotic-resistant bacteria, and then there’s Clostridium difficile—a.k.a. C. difficile—a bacterium that attacks the intestines of even young and healthy hospital patients after the use of antibiotics.

It is because of C. difficile that Dr. L. Clifford McDonald jumped into the AMR fight. The epidemiologist was finishing his work analyzing the spread of SARS in Toronto hospitals in 2004 when he turned his attention to C. difficile, convinced that the bacteria would become more common and more deadly. He was right, and today he’s at the forefront of treating the infection and preventing the spread of AMR as senior advisor for science and integrity in the CDC’s Division of Healthcare Quality Promotion. “[AMR] is an area that we’re funding heavily…insofar as the CDC budget can fund anything heavily,” says McDonald, whose group has awarded $14 million in contracts for innovative anti-AMR approaches.

Developing new antibiotics is a major part of the AMR battle. The majority of new antibiotics developed in recent years have been variations of existing drug classes. It’s been three decades since the last new class of antibiotics was introduced. Less than 5% of venture capital in pharmaceutical R&D is focused on antimicrobial development. A 2008 study found that less than 10% of the 167 antibiotics in development at the time had a new “mechanism of action” to deal with multidrug resistance. “The low-hanging fruit [of antibiotic development] has been picked,” noted a WHO report.

Researchers will have to dig much deeper to develop novel medicines. Machine learning could help drug developers sort through much larger data sets and go about the capital-intensive drug development process in a more prescriptive fashion, synthesizing those molecules most likely to have an impact.

McDonald believes that it will become easier to find new antibiotics if we gain a better understanding of the communities of bacteria living in each of us—as many as 1,000 different types of microbes live in our intestines, for example. Disruption to those microbial communities—our “microbiome”—can herald AMR. McDonald says that Big Data and machine learning will be needed to unlock our microbiomes, and that’s where much of the medical community’s investment is going.

He predicts that within five years, hospitals will take fecal samples or skin swabs and sequence the microorganisms in them as a kind of pulse check on antibiotic resistance. “Just doing the bioinformatics to sort out what’s there and the types of antibiotic resistance that might be in that microbiome is a Big Data challenge,” McDonald says. “The only way to make sense of it, going forward, will be advanced analytic techniques, which will no doubt include machine learning.”

Reducing Resistance on the Farm

Bringing information closer to where it’s needed could also help reduce agriculture’s contribution to the antibiotic resistance problem. Antibiotics are widely given to livestock to promote growth or prevent disease. In the United States, more kilograms of antibiotics are administered to animals than to people, according to data from the FDA.

One company has developed a rapid, on-farm diagnostics tool to provide livestock producers with more accurate disease detection to make more informed management and treatment decisions, which it says has demonstrated a 47% to 59% reduction in antibiotic usage. Such systems, combined with pressure or regulations to reduce antibiotic use in meat production, could also help turn the AMR tide.

Breaking Down Data Silos Is the First Step

Adding to the complexity of the fight against AMR is the structure and culture of the global healthcare system itself. Historically, healthcare has been a siloed industry, notorious for its scattered approach focused on transactions rather than healthy outcomes or the true value of treatment. There’s no definitive data on the impact of AMR worldwide; the best we can do is infer estimates from the information that does exist.

The biggest issue is the availability of good data to share through mobile solutions, to drive HCI clinical-decision support tools, and to feed supercomputers and machine-learning platforms. “We have a fragmented healthcare delivery system and therefore we have fragmented information. Getting these sources of data all into one place and then enabling them all to talk to each other has been problematic,” McDonald says.

Collecting, integrating, and sharing AMR-related data on a national and ultimately global scale will be necessary to better understand the issue. HCI and mobile tools can help doctors, hospitals, and public health authorities collect more information while advanced analytics, machine learning, and in-memory computing can enable them to analyze that data in close to real time. As a result, we’ll better understand patterns of resistance from the bedside to the community and up to national and international levels, says Solomon. The good news is that new technology capabilities like AI and new potential streams of data are coming online as an era of data sharing in healthcare is beginning to dawn, adds McDonald.

The ideal goal is a digitally enabled virtuous cycle of information and treatment that could save millions of dollars, lives, and perhaps even civilization if we can get there. D!

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Dr. David Delaney is Chief Medical Officer for SAP.

Joseph Miles is Global Vice President, Life Sciences, for SAP.

Walt Ellenberger is Senior Director Business Development, Healthcare Transformation and Innovation, for SAP.

Saravana Chandran is Senior Director, Advanced Analytics, for SAP.

Stephanie Overby is an independent writer and editor focused on the intersection of business and technology.

4 Traits Set Digital Leaders Apart From 97% Of The Competition

Vivek Bapat

Like the classic parable of the blind men and the elephant, it seems everyone has a unique take on digital transformation. Some equate digital transformation with emerging technologies, placing their bets on the Internet of Things, machine learning, and artificial intelligence. Others see it as a way to increase efficiencies and change business processes to accelerate products to market. Still others think of it as a means of strategic differentiation, innovating new business models for serving and engaging their customers. Despite the range of viewpoints, many businesses are still challenged to evolve their digital strategies pragmatically, in ways that are meaningful, industry-disruptive, and market-leading.

According to a recent study of more than 3,000 senior executives across 17 countries and regions, a paltry three percent of businesses worldwide have successfully completed enterprise-wide digital transformation initiatives, even though 84% of C-level executives rank such efforts as “critically important” to the fundamental sustenance of their business.

The most comprehensive global study of its kind, the SAP Center for Business Insight report “SAP Digital Transformation Executive Study: 4 Ways Leaders Set Themselves Apart,” produced in collaboration with Oxford Economics, identified the challenges, opportunities, value, and key technologies driving digital transformation. The findings specifically analyzed the performance of “digital leaders” – those who are connecting people, things, and businesses more intelligently and more effectively, and creating punctuated change faster than their less advanced rivals.

Analyzing the data, we found it eye-opening that only three percent of companies (roughly the top 100 surveyed) are successfully realizing their full potential through digital transformation. Even more remarkable, these leaders share four fundamental traits, regardless of their region of operation, their size, their organizational structure, or their industry.

We distilled these traits in the hope that organizations in the early stages of transformation, or still struggling to find their bearings, can embrace these principles in order to succeed. Ultimately, I see these leaders as truly ambidextrous organizations, managing evolutionary and revolutionary change simultaneously, willing to embrace innovation – not just on the edges of their business, but firmly in its core.

Here are the four traits that set these leaders apart from the rest:

Trait #1: They see digital transformation as truly transformational

An overwhelming majority (96%) of digital leaders view digital transformation as a core business goal that requires a unified digital mindset across the entire enterprise. But instead of allowing individual functions to change at their own pace, digital leaders prefer to evolve the organization to help ensure the success of their digital strategies.

The study found that 56% of these businesses regularly shift their organizational structure, which includes processes, partners, suppliers, and customers, compared to 10% of remaining companies. Plus, 70% actively bring lines of business together through cross-functional processes and technologies.

By creating a firm foundation for transformation, digital leaders are further widening the gap between themselves and their less advanced competitors as they innovate business models that can mitigate emerging risks and seize new opportunities quickly.

Trait #2: They focus on transforming customer-facing functions first

Although most companies believe technology, the pace of change, and growing global competition are the key global trends that will affect everything for years to come, digital leaders are expanding their frame of mind to consider the influence of customer empowerment. Executives who build a momentum of breakthrough innovation and industry transformation are the ones that are moving beyond the high stakes of the market to the activation of complete, end-to-end customer experiences.

In fact, 92% of digital leaders have established sophisticated digital transformation strategies and processes to drive transformational change in customer satisfaction and engagement, compared to 22% of their less mature counterparts. As a result, 70% have realized significant or transformational value from these efforts.

Trait #3: They create a virtuous cycle of digital talent

There’s little doubt that the competition for qualified talent is fierce. But the nearly three-quarters of companies that demonstrate digital-transformation leadership find it easier to attract and retain talent, because they are five times more likely to leverage digitization in their talent management efforts.

The impact of their efforts goes beyond empowering recruiters to identify best-fit candidates, highlight risk factors and hiring errors, and predict long-term talent needs. Nearly half (48%) of digital leaders understand that they must invest heavily in the development of digital skills and technology to drive revenue, retain productive employees, and create new roles to keep up with their digital maturity over the next two years, compared to 30% of all surveyed executives.

Trait #4: They invest in next-generation technology using a bimodal architecture

A couple of years ago, Peter Sondergaard, senior vice president at Gartner and global head of research, observed that “CIOs can’t transform their old IT organization into a digital startup, but they can turn it into a bi-modal IT organization. Forty-five percent of CIOs state they currently have a fast mode of operation, and we predict that 75% of IT organizations will be bimodal in some way by 2017.”

Based on the results of the SAP Center for Business Insight study, Sondergaard’s prediction was spot on. As digital leaders dive into advanced technologies, 72% are using a digital twin of the conventional IT organization to operate efficiently without disruption while refining innovative scenarios that resolve business challenges, then integrating them to stay ahead of the competition. Unfortunately, only 30% of less advanced businesses embrace this view.

Working within this bimodal architecture is emboldening digital leaders to take on incredibly progressive technology. For example, the study found that 50% of these firms are using artificial intelligence and machine learning, compared to seven percent of all respondents. They are also leading the adoption curve of Big Data solutions and analytics (94% vs. 60%) and the Internet of Things (76% vs. 52%).

Digital leadership is a practice of balance, not pure digitization

Most executives understand that digital transformation is a critical driver of revenue growth, profitability, and business expansion. However, as digital leaders are proving, digital strategies must deliver a balance of organizational flexibility, forward-looking technology adoption, and bold change. And clearly, this approach is paying dividends for them. They are growing market share, increasing customer satisfaction, improving employee engagement, and, perhaps more important, achieving more profitability than ever before.

For any company looking to catch up to digital leaders, the conversation around digital transformation needs to change immediately to combat three deadly sins: Stop investing in one-off, isolated projects hidden in a single organization. Stop viewing IT as an enabler instead of a strategic partner. Stop walling off the rest of the business from siloed digital successes.

As our study shows, companies that treat their digital transformation as an all-encompassing, all-sharing, and all-knowing business imperative will be the ones that disrupt the competitive landscape and stay ahead of a constantly evolving economy.

Follow me on Twitter @vivek_bapat

For more insight on digital leaders, check out the SAP Center for Business Insight report, conducted in collaboration with Oxford Economics, “SAP Digital Transformation Executive Study: 4 Ways Leaders Set Themselves Apart.”

About Vivek Bapat

Vivek Bapat is the Senior Vice President, Global Head of Marketing Strategy and Thought Leadership, at SAP. He leads SAP's Global Marketing Strategy, Messaging, Positioning and related Thought Leadership initiatives.