The Jobs Of The Future Are Yet To Be Written

Hessie Jones

[Image: “Don’t Be a Job Hopper” poster, U.S. National Archives]

I’ve been told I’m a fatalist. Perhaps I am. From all the research I’ve done on the Future of Work and the amazing peers I’ve collaborated with in this category, it is clear to me that the past definitely does not dictate the future. As this image implies, there was a time when we all expected to follow a common path: from school to a job for life.

I’ve seen firsthand how technology has wielded its way into the marketing sector. It’s changed the way people consume information and how they communicate with each other. This disruption in communication has turned the marketing industry on its head, and we find ourselves continuously struggling to keep up. Within a decade, I have seen colleagues’ careers cut short – those who have banked tremendous experience in traditional mass communications: TV, print, radio, and promotions.

The communications industry is in a perpetual state of disruption

By 2005 a new crop of roles had emerged, necessitated by the increased demand for web-based solutions: UX design, online advertising, PPC management, community management, and web development. Now, with the rise of ad blocking and ever-increasing consumer demand for transparency and privacy, the online media industry is already being disrupted again. The very jobs that were in high demand a decade earlier will need modification. In my recent post, Have Consumers Won? Ad Blockers and the Demise of the Industry, I concluded:

Consumers are sending a message to the industry. They don’t want the current web experience, interrupted with annoying ads and messaging. They have been running away from it for some time. The adoption of ad blockers is just the beginning.

The pace of technological disruption has become so fast within the communications industry that marketers are turning to MOOCs and reference blogs just to keep up.

We are not alone. All industries are experiencing this.

“We are on the cusp of the Fourth Industrial Revolution”

The World Economic Forum recently published a report that “represented more than 13 million employees across 9 broad industry sectors in 15 major developed and emerging economies and regional economic areas.” The report indicates:

Developments in previously disjointed fields such as artificial intelligence and machine learning, robotics, nanotechnology, 3D printing, and genetics and biotechnology are all building on and amplifying one another.

Smart systems—homes, factories, farms, grids, or entire cities—will help tackle problems ranging from supply chain management to climate change.

The chart below cites the instigating variables that are forcing industries and organizations to undergo major structural and operational changes, from prevailing mobile consumption to the increasing efficiency of cloud computing to Big Data usage.

[Chart: technological drivers of change, World Economic Forum]

We are also witnessing a decline in pure labor. This article uses the term “transitioning labor,” which is disconcerting in and of itself. The larger losses (between 2015 and 2020) will be felt in office and administrative roles and in manufacturing and production, to the tune of 7.1 million jobs in the U.S.

[Chart: projected employment losses by job family, 2015–2020, World Economic Forum]

Is labor dead?

Technology has enabled an exponential surge in productivity, greatly improving gross domestic product and efficiency along the way.

The rise of the industrial revolution depended on human capital to produce things to meet market demand. WWII … the growth of the automotive industry… for decades, industries have relied on workers to improve productivity. The effect was cyclical:

…as businesses generated more value from their workers, the country as a whole became richer, which fueled more economic activity and created even more jobs.

Consider this, however: Based on the chart below, productivity and employment, which for decades correlated, began to diverge around the year 2000. As production strengthens, employment begins to wane. Over time, economic growth in certain sectors no longer relies on the creation of jobs.

Note the growing disparity between GDP and household income since 1975. Median household income has remained stagnant over time.

Erik Brynjolfsson, professor at the MIT Sloan School of Management, and Andrew McAfee, authors of Race Against The Machine: How the Digital Revolution is Accelerating Innovation, Driving Productivity, and Irreversibly Transforming Employment and the Economy, call this the “great decoupling.”

[Chart: U.S. productivity and employment, showing the post-2000 divergence]

Automation and robotics are already here

Advances are happening everywhere. A recent study by academics at Oxford University suggests that 47% of today’s jobs could be automated in the next two decades.

W. Brian Arthur, a former economics professor at Stanford University, calls it the “autonomous economy.”

As Brynjolfsson asserts:

People are falling behind because technology is advancing so fast and our skills and organizations aren’t keeping up.

The socioeconomic implications of this transition period are being felt today. Despite the larger economic impacts of job losses, those who are thriving fall within technology sectors and industries that rely on next-generation skill sets to respond to this increasingly competitive environment. Consider how Uber is currently impacting regulations in the taxi industry… or how Airbnb is leaving the smaller hotel chains scrambling… or how Tesla and Google will bring the car industry to its knees.

This is progress. But it’s also widening the skills gap and polarizing an economy.

Education is the starting point

It’s clear our current education system is ill-equipped to prepare future generations for the impact of these ongoing changes. Consider the vantage point of a student entering college today or in the coming years. Accenture notes:

More Americans go to college than ever. But how many think about the return they will get from tuition payments that can easily reach $200,000? Up to half are unemployed or underemployed a year after graduation. And two-thirds say they need further training and instruction to enter the workforce.

As student debt balloons, it’s time for society to reevaluate post-secondary education—and our entire system. We need to create new and innovative systems that help individuals achieve their potential.

It’s clear that the Harvards and MITs of the world will need to change their models in the future as well. Free or more affordable education in the form of MOOCs (massive open online courses) and organizations like Khan Academy or Codecademy are already drawing students (of all ages) in droves.

The success of the Industrial Revolution and the eras that followed was a result, in part, of an education system that was in step with the demands of the market. Not anymore.

The disconnect between current education and the needs of organizations has never been clearer. PwC announced last August that degrees and A-level results will no longer be a criterion in assessing the value of potential candidates.

Here’s what I am observing today among peers and millennials: We are all in a mode of reinvention and continuous learning, striving to learn new skills in web development, writing, coding… all in the effort to be marketable under the new job conditions.

Machines can never replace humans

As part of our series on Humanity in Data, we will continue to explore the need for human-only abilities to help industry and society progress in the coming decades. While automation and robotics will displace simple functions and be able to analyze zettabytes of information and deliver conclusions, human decision-making and creativity will be necessary to adjust to the growing challenges the world will face.

What’s exciting is that the resources available for learning are vast. As dismal as the current state of the nation is, we are at a critical point of reinvention that allows each and every one of us to create opportunities to respond to real needs today.

Business Insider recently outlined what the opportunities of the future might look like:

1. Tele-surgeon: These surgeons operate on people remotely with robotic tools instead of human hands.

2. Nostalgist: Nostalgists are interior designers specializing in recreating memories for retired people. The elderly of 2030 who don’t want to reside in a typical “retirement village” will have the luxury of living in a space inspired by their favorite decade or place.

3. Re-wilder: These professionals were formerly called “farmers.” The role of the re-wilder, however, is not to raise food crops but rather to undo environmental damage to the countryside caused by people, factories, cars, etc.

4. Simplicity expert: The simplicity experts of 2030 are interested in looking at how businesses can simplify and streamline their operations. For instance, they can reduce 15 administrative steps to three, or four interviews to one, or three days of work to a half-hour.

5. Garbage designer: Garbage designers find creative ways to turn the byproducts of the manufacturing process into high-quality materials for making entirely separate products.

6. Robot counselor: In 2030, robots will play a greater part in providing home care and services than they do today. The robot counselor will be a resource for picking the right bot for a family, by observing how the family interacts and identifying their needs and lifestyle.

7. Healthcare navigator: These professionals teach patients and their loved ones the ins and outs of a complicated medical system. The navigator also helps people to manage their contact with the medical system with the least amount of stress and delay.

8. Solar technology specialist: These specialists may own land where they manage a large spread of solar grids, to sell the harvested power to stations and other communities — or they may work as consultants in cities and other urban spaces, helping building owners to design, build, and maintain solar panels.

9. Aquaponic fish farmer: In 2030, populations of wild fish are disappearing — so new production methods like aquaponics will step in to replace fish that we can no longer catch in the wild. Aquaponics combines fish farming with gardening, where plants grow over water to cover its surface, while fish live below. The plants return oxygen to the water, and the fish produce waste that provides fertilizer for the plants.

Progress dictates a shift in mindset

Retirement doesn’t exist for many. Reinvention is mandatory for survival.

Organizations and the employable workforce are figuring this out as the needs of the market evolve.

For the next generation, perhaps the question needs to change from “What do you want to be when you grow up?” to “What do you think needs fixing, and what do you want to do to make a difference?”

Want more insight on where the workplace is headed? See Social Collaboration: A Powerful Force In The Future Of Work.

Image source: Wikimedia


IoT Can Keep You Healthy — Even When You Sleep [VIDEO]

Christine Donato

Today the Internet of Things is revamping technology.

Smart devices speak to each other and work together to provide the end user with a better product experience.

Coinciding with this change in technology is a change in people. We’ve transitioned from a world of people who love processed foods and french fries to people who eat kale chips and Greek yogurt…and actually like it.

People are taking ownership of their well-being, and preventative care is at the forefront of focus for both physicians and patients. Fitness trackers alert wearers to the exact number of calories burned from walking a certain number of steps. Mobile apps calculate our perfect nutritional balance. And people are realizing that it’s important to monitor vitals even while we sleep.

According to research conducted at Harvard University, proper sleep patterns bolster health benefits such as improved immune function, a faster metabolism, preserved memory, and reduced stress and depression.

Conversely, the Harvard study determined that lack of sleep can negatively affect judgement, mood, and the ability to retain information, as well as increase the risk of obesity, diabetes, cardiovascular disease, and even premature death.

Through the Internet of Things, researchers can now explore sleep patterns without the usual sleep labs and movement-restricting electrode wires. And with connected devices, individuals can now easily monitor and positively influence their own health.

EarlySense, a startup credited with the creation of continuous patient monitoring solutions focused on early detection of patient deterioration, mid-sleep falls, and pressure ulcers, began with a mission to prevent premature and preventable deaths.

Without constant monitoring, patients with unexpected clinical deterioration may be accidentally neglected, and their conditions can easily escalate into emergency situations.

Motivated by many instances of patients who died from preventable post-elective surgery complications, EarlySense founders created a product that constantly monitors patients when hospital nurses can’t, alerting the main nurse station when a patient leaves his or her bed and could potentially fall, or when a patient’s vital signs drop or rise unexpectedly.

Now EarlySense technology has expanded outside of the hospital realm. The EarlySense wellness sensor, a device connected via the Internet of Things and mobile solutions and supported by SAP HANA Cloud Platform, monitors vital signs while a person sleeps. The device is completely wireless and lies discreetly underneath one’s mattress. The sensor picks up the mechanical vibrations that the patient’s body emits while sleeping, continuously monitoring heart and respiratory rates.
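EarlySense hasn’t published its signal-processing pipeline, but the general technique behind contact-free vital-sign monitoring is well understood: breathing and heartbeat show up as periodic components of the bed’s vibration signal in distinct frequency bands. The Python sketch below illustrates that idea only; the sampling rate, band limits, and synthetic signal are illustrative assumptions, not the vendor’s algorithm.

```python
# Minimal sketch of contact-free vital-sign estimation: recover heart and
# respiratory rates from a bed-mounted vibration signal by locating spectral
# peaks in physiological frequency bands. All parameters are assumptions for
# illustration; EarlySense's actual processing is proprietary.
import numpy as np

FS = 100  # assumed sensor sampling rate, Hz

def dominant_rate_bpm(signal, fs, low_hz, high_hz):
    """Return the strongest frequency in [low_hz, high_hz], in cycles per minute."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return freqs[band][np.argmax(spectrum[band])] * 60.0

def estimate_vitals(vibration):
    return {
        "respiratory_rate": dominant_rate_bpm(vibration, FS, 0.1, 0.5),  # ~6-30 breaths/min
        "heart_rate": dominant_rate_bpm(vibration, FS, 0.8, 3.0),        # ~48-180 beats/min
    }

# Synthetic one-minute recording: 15 breaths/min and 66 beats/min plus noise.
t = np.arange(0, 60, 1.0 / FS)
demo = np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.sin(2 * np.pi * 1.1 * t)
demo += 0.1 * np.random.randn(len(t))
print(estimate_vitals(demo))  # approx. {'respiratory_rate': 15.0, 'heart_rate': 66.0}
```

In a real deployment the raw stream would also be cleaned of motion artifacts and streamed to the cloud for trend analysis; the point here is simply that two vital signs can be read from one unobtrusive vibration sensor.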

Watch this short video to learn more about how the EarlySense wellness sensor works:

The result is faster diagnoses with better treatments and outcomes. Sleep issues can be identified and addressed; individuals can use the data collected to make adjustments in diet or exercise habits; and those on heavy pain medications can monitor the way their bodies react to the medication. In addition, physicians can use the data collected from the sensor to identify patient health problems before they escalate into an emergency situation.

Connected care is opening the door for a new way to practice health. Through connected care apps that link people with their doctors, fitness trackers that measure daily activity, and sensors like the EarlySense wellness sensor, today’s technology enables people and physicians to work together to prevent sickness and accidents before they occur. Technology is forever changing the way we live, and in turn we are living longer, healthier lives.

To learn how SAP HANA Cloud Platform can affect your business, visit It&Me.

For more stories, join me on Twitter.


About Christine Donato

Christine Donato is a Senior Integrated Marketing Specialist at SAP. She is an accomplished project manager and leader of multiple marketing and sales enablement campaigns and events that supported a multi-million-euro business.

Zhena’s Gypsy Tea Brews Sustainable Growth On Cloud ERP

David Trites

Recently I had the pleasure of hosting a podcast with Paula Muesse, COO and CFO of Zhena’s Gypsy Tea, a small, organic, fair-trade tea company based in California, and Ursula Ringham from SAP. We talked about some of the business challenges Zhena’s faces and how the company’s ERP solution helped spur growth and digital transformation.

Small but complex business

Zhena’s has grown from one person (Zhena Muzyka) selling hand-packed tea from a cart into a thriving small business that puts quality, sustainability, and fair trade first. And although the company is small, its business is complex.

For starters, tea isn’t grown in the United States, so Zhena’s has to maintain and import inventory from multiple warehouses around the world. Some of their tea blends have up to 14 ingredients, and each one has a different lead time. That makes demand planning difficult. In addition, the FDA and US Customs require that designated ingredients be traced and treated a certain way to comply with regulations.

Being organic and fair trade also makes things more complicated. Zhena’s has to pass an annual organic compliance audit for all products and processing facilities. And all products need to be traceable back to the farms where the tea was grown and picked to ensure the workers (mostly women) are paid fair wages.

Sustainable growth

Prior to implementing its new ERP system, Zhena’s was using a mix of tools like QuickBooks, Excel, and paper to manage the business. But to sustain growth and ensure future success, the company had to make some changes. Zhena’s needed an integrated software solution that could handle all facets of the business. It needed a tool that could help with cost control and profitability analysis and facilitate complex reporting and regulatory requirements.

The SAP Business ByDesign solution was the perfect choice. The cloud-based ERP solution reduced both business and IT costs, simplified processes from demand planning to accounting, and enabled mobile access and real-time reporting.

Check out the podcast to hear more about how Zhena’s successfully transformed its business by moving to SAP Business ByDesign.

This article originally appeared on SAP Business Trends.

Building a successful company is hard work. SAP’s affordable solutions for small and midsize companies are designed to make it easier. Simple to install and use, SAP SME Solutions help you automate and integrate your business processes to give real-time, actionable insights. So you can make decisions on the spot. Find out how Run Simple can work for you. Visit sap.com/sme.


About David Trites

David Trites is a Director of SAP Global Marketing. He is responsible for producing interesting and compelling customer stories that will humanize the SAP brand, support sales and marketing teams across SAP, and increase the awareness of SAP in key markets.

Heroes in the Race to Save Antibiotics

Dr. David Delaney, Joseph Miles, Walt Ellenberger, Saravana Chandran, and Stephanie Overby

Last August, a woman arrived at a Reno, Nevada, hospital and told the attending doctors that she had recently returned from an extended trip to India, where she had broken her right thighbone two years earlier. The woman, who was in her 70s, had subsequently developed an infection in her thigh and hip for which she was hospitalized in India several times. The Reno doctors recognized that the infection was serious—and the visit to India, where antibiotic-resistant bacteria run rampant, raised red flags.

When none of the 14 antibiotics the physicians used to treat the woman worked, they sent a sample of the bacterium to the U.S. Centers for Disease Control (CDC) for testing. The CDC confirmed the doctors’ worst fears: the woman had a class of microbe called carbapenem-resistant Enterobacteriaceae (CRE). Carbapenems are a powerful class of antibiotics used as last-resort treatment for multidrug-resistant infections. The CDC further found that, in this patient’s case, the pathogen was impervious to all 26 antibiotics approved by the U.S. Food and Drug Administration (FDA).

In other words, there was no cure.

This is just the latest alarming development signaling the end of the road for antibiotics as we know them. In September, the woman died from septic shock, in which an infection takes over and shuts down the body’s systems, according to the CDC’s Morbidity and Mortality Weekly Report.

Other antibiotic options, had they been available, might have saved the Nevada woman. But the solution to the larger problem won’t be a new drug. It will have to be an entirely new approach to the diagnosis of infectious disease, to the use of antibiotics, and to the monitoring of antimicrobial resistance (AMR)—all enabled by new technology.

But that new technology is not being implemented fast enough to prevent what former CDC director Tom Frieden has nicknamed “nightmare bacteria.” And the nightmare is becoming scarier by the year. A 2014 British study calculated that 700,000 people die globally each year because of AMR. By 2050, antibiotic resistance could cause 10 million deaths and cost US$100 trillion a year, according to a 2014 estimate. And the rate of AMR is growing exponentially, thanks to the speed with which humans serving as hosts for these nasty bugs can move among healthcare facilities—or countries. In the United States, for example, CRE had been seen only in North Carolina in 2000; today it’s nationwide.

Abuse and overuse of antibiotics in healthcare and livestock production have enabled bacteria to both mutate and acquire resistant genes from other organisms, resulting in truly pan-drug resistant organisms. As ever-more powerful superbugs continue to proliferate, we are potentially facing the deadliest and most costly human-made catastrophe in modern times.

“Without urgent, coordinated action by many stakeholders, the world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill,” said Dr. Keiji Fukuda, assistant director-general for health security for the World Health Organization (WHO).

Even if new antibiotics could solve the problem, there are obstacles to their development. For one thing, antibiotics have complex molecular structures, which slows the discovery process. Further, they aren’t terribly lucrative for pharmaceutical manufacturers: public health concerns call for new antimicrobials to be financially accessible to patients and used conservatively precisely because of the AMR issue, which reduces the financial incentives to create new compounds. The last entirely new class of antibiotic was introduced 30 years ago. Finally, bacteria will develop resistance to new antibiotics as well if we don’t adopt new approaches to using them.

Technology can play the lead role in heading off this disaster. Vast amounts of data from multiple sources are required for better decision making at all points in the process, from tracking or predicting antibiotic-resistant disease outbreaks to speeding the potential discovery of new antibiotic compounds. However, microbes will quickly adapt and resist new medications, too, if we don’t also employ systems that help doctors diagnose and treat infection in a more targeted and judicious way.

Indeed, digital tools can help in all four actions that the CDC recommends for combating AMR: preventing infections and their spread, tracking resistance patterns, improving antibiotic use, and developing new diagnostics and treatment.

Meanwhile, individuals who understand both the complexities of AMR and the value of technologies like machine learning, human-computer interaction (HCI), and mobile applications are working to develop and advocate for solutions that could save millions of lives.

Keeping an Eye Out for Outbreaks

Like others who are leading the fight against AMR, Dr. Steven Solomon has no illusions about the difficulty of the challenge. “It is the single most complex problem in all of medicine and public health—far outpacing the complexity and the difficulty of any other problem that we face,” says Solomon, who is a global health consultant and former director of the CDC’s Office of Antimicrobial Resistance.

Solomon wants to take the battle against AMR beyond the laboratory. In his view, surveillance—tracking and analyzing various data on AMR—is critical, particularly given how quickly and widely it spreads. But surveillance efforts are currently fraught with shortcomings. The available data is fragmented and often not comparable. Hospitals fail to collect the representative samples necessary for surveillance analytics, collecting data only on those patients who experience resistance and not on those who get better. Laboratories use a wide variety of testing methods, and reporting is not always consistent or complete.

Surveillance can serve as an early warning system. But weaknesses in these systems have caused public health officials to consistently underestimate the impact of AMR in loss of lives and financial costs. That’s why improving surveillance must be a top priority, says Solomon, who previously served as chair of the U.S. Federal Interagency Task Force on AMR and has been tracking the advance of AMR since he joined the U.S. Public Health Service in 1981.

A Collaborative Diagnosis

Ineffective surveillance has also contributed to huge growth in the use of antibiotics when they aren’t warranted. Strong patient demand and financial incentives for prescribing physicians are blamed for antibiotics abuse in China. India has become the largest consumer of antibiotics on the planet, in part because they are prescribed or sold for diarrheal diseases and upper respiratory infections for which they have limited value. And many countries allow individuals to purchase antibiotics over the counter, exacerbating misuse and overuse.

In the United States, antibiotics are improperly prescribed 50% of the time, according to CDC estimates. One study of adult patients visiting U.S. doctors to treat respiratory problems found that more than two-thirds of antibiotics were prescribed for conditions that were not infections at all or for infections caused by viruses—for which an antibiotic would do nothing. That’s 27 million courses of antibiotics wasted a year—just for respiratory problems—in the United States alone.

And even in countries where there are national guidelines for prescribing antibiotics, those guidelines aren’t always followed. A study published in medical journal Family Practice showed that Swedish doctors, both those trained in Sweden and those trained abroad, inconsistently followed rules for prescribing antibiotics.

Solomon strongly believes that, worldwide, doctors need to expand their use of technology in their offices or at the bedside to guide them through a more rational approach to antibiotic use. Doctors have traditionally been reluctant to adopt digital technologies, but Solomon thinks that the AMR crisis could change that. New digital tools could help doctors and hospitals integrate guidelines for optimal antibiotic prescribing into their everyday treatment routines.

“Human-computer interactions are critical, as the amount of information available on antibiotic resistance far exceeds the ability of humans to process it,” says Solomon. “It offers the possibility of greatly enhancing the utility of computer-assisted physician order entry (CPOE), combined with clinical decision support.” Healthcare facilities could embed relevant information and protocols at the point of care, guiding the physician through diagnosis and prescription and, as a byproduct, facilitating the collection and reporting of antibiotic use.
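To make the CPOE idea concrete, here is a minimal sketch of such an order-entry check: it compares a proposed antibiotic against a local antibiogram, suggests narrower agents when the local data supports them, and logs every order for later reporting. The drug names, susceptibility figures, and threshold are hypothetical illustrations, not clinical guidance or any vendor’s actual system.

```python
# Sketch of a clinical-decision-support hook at antibiotic order entry.
# The antibiogram, drug lists, and threshold below are invented for
# illustration only.
from dataclasses import dataclass

# Fraction of recent local isolates susceptible to each drug (hypothetical).
LOCAL_ANTIBIOGRAM = {
    ("E. coli", "nitrofurantoin"): 0.96,
    ("E. coli", "ciprofloxacin"): 0.78,
    ("E. coli", "meropenem"): 0.99,
}
BROAD_SPECTRUM = {"meropenem", "piperacillin-tazobactam"}
SUSCEPTIBILITY_THRESHOLD = 0.90

@dataclass
class Order:
    pathogen: str
    drug: str
    physician: str

audit_log = []  # by-product: structured data on antibiotic use for reporting

def review_order(order: Order) -> list[str]:
    """Return warnings for the ordering physician; record the order either way."""
    warnings = []
    susceptibility = LOCAL_ANTIBIOGRAM.get((order.pathogen, order.drug))
    if susceptibility is not None and susceptibility < SUSCEPTIBILITY_THRESHOLD:
        warnings.append(
            f"Only {susceptibility:.0%} of local {order.pathogen} isolates are "
            f"susceptible to {order.drug}; consider an alternative."
        )
    if order.drug in BROAD_SPECTRUM:
        narrower = [
            d for (bug, d), s in LOCAL_ANTIBIOGRAM.items()
            if bug == order.pathogen and d not in BROAD_SPECTRUM
            and s >= SUSCEPTIBILITY_THRESHOLD
        ]
        if narrower:
            warnings.append(f"Narrower agents likely effective: {', '.join(narrower)}.")
    audit_log.append((order.physician, order.pathogen, order.drug, tuple(warnings)))
    return warnings

print(review_order(Order("E. coli", "meropenem", "dr_lee")))
# -> ['Narrower agents likely effective: nitrofurantoin.']
```

Embedded at the point of care this way, the same rule that nudges the physician toward a narrower drug also produces the usage data that surveillance efforts currently lack.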

Cincinnati Children’s Hospital’s antibiotic stewardship division has deployed a software program that gathers information from electronic medical records, order entries, computerized laboratory and pathology reports, and more. The system measures baseline antimicrobial use, dosing, duration, costs, and use patterns. It also analyzes bacteria and trends in their susceptibilities and helps with clinical decision making and prescription choices. The goal, says Dr. David Haslam, who heads the program, is to decrease the use of “big gun” super antibiotics in favor of more targeted treatment.

While this approach is not yet widespread, there is consensus that incorporating such clinical-decision support into electronic health records will help improve quality of care, contain costs, and reduce overtreatment in healthcare overall—not just in AMR. A 2013 randomized clinical trial found that doctors who used decision-support tools were significantly less likely to order antibiotics than those in the control group and prescribed 50% fewer broad-spectrum antibiotics.

Putting mobile devices into doctors’ hands could also help them accept decision support, believes Solomon. Last summer, Scotland’s National Health Service developed an antimicrobial companion app to give practitioners nationwide mobile access to clinical guidance, as well as an audit tool to support boards in gathering data for local and national use.

“The immediacy and the consistency of the input to physicians at the time of ordering antibiotics may significantly help address the problem of overprescribing in ways that less-immediate interventions have failed to do,” Solomon says. In addition, handheld devices with so-called lab-on-a-chip technology could be used to test clinical specimens at the bedside and transmit the data across cellular or satellite networks in areas where infrastructure is more limited.

Artificial intelligence (AI) and machine learning can also become invaluable technology collaborators to help doctors more precisely diagnose and treat infection. In such a system, “the physician and the AI program are really ‘co-prescribing,’” says Solomon. “The AI can handle so much more information than the physician and make recommendations that can incorporate more input on the type of infection, the patient’s physiologic status and history, and resistance patterns of recent isolates in that ward, in that hospital, and in the community.”

Speed Is Everything

Growing bacteria in a dish has never appealed to Dr. James Davis, a computational biologist with joint appointments at Argonne National Laboratory and the University of Chicago Computation Institute. An early member of a growing breed of computational biologists, Davis chose a PhD advisor in 2004 who was steeped in bioinformatics technology “because you could see that things were starting to change,” he says. He was one of the first in his microbiology department to submit a completely “dry” dissertation—that is, one that was all digital with nothing grown in a lab.

Upon graduation, Davis wanted to see if it was possible to predict whether an organism would be susceptible or resistant to a given antibiotic, leading him to explore the potential of machine learning to predict AMR.

As the availability of cheap computing power has gone up and the cost of genome sequencing has gone down, it has become possible to sequence a pathogen sample in order to detect its AMR resistance mechanisms. This could allow doctors to identify the nature of an infection in minutes instead of hours or days, says Davis.

Davis is part of a team creating a giant database of bacterial genomes with AMR metadata for the Pathosystems Resource Integration Center (PATRIC), funded by the U.S. National Institute of Allergy and Infectious Diseases to collect data on priority pathogens, such as tuberculosis and gonorrhea.

Because the current inability to identify microbes quickly is one of the biggest roadblocks to making an accurate diagnosis, the team’s work is critically important. The standard method for identifying drug resistance is to take a sample from a wound, blood, or urine and expose the resident bacteria to various antibiotics. If the bacterial colony continues to divide and thrive despite the presence of a normally effective drug, it indicates resistance. The process typically takes between 16 and 20 hours, itself an inordinate amount of time in matters of life and death. For certain strains of antibiotic-resistant tuberculosis, though, such testing can take a week. While physicians are waiting for test results, they often prescribe broad-spectrum antibiotics or make a best guess about what drug will work based on their knowledge of what’s happening in their hospital, “and in the meantime, you either get better,” says Davis, “or you don’t.”

At PATRIC, researchers are using machine-learning classifiers to identify regions of the genome involved in antibiotic resistance that could form the foundation for a “laboratory free” process for predicting resistance. Being able to identify the genetic mechanisms of AMR and predict the behavior of bacterial pathogens without petri dishes could inform clinical decision making and improve reaction time. Thus far, the researchers have developed machine-learning classifiers for identifying antibiotic resistance in Acinetobacter baumannii (a big player in hospital-acquired infection), methicillin-resistant Staphylococcus aureus (a.k.a. MRSA, a worldwide problem), and Streptococcus pneumoniae (a leading cause of bacterial meningitis), with accuracies ranging from 88% to 99%.
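Published work from the PATRIC team describes classifiers built on k-mer representations of genomes. The toy sketch below shows the shape of that pipeline with scikit-learn; the sequences, labels, and model choice are fabricated stand-ins to keep the example self-contained, and real systems work from whole genomes with far larger k and feature spaces.

```python
# Toy AMR prediction pipeline: genome -> k-mer count vector -> classifier.
# Training data here is fabricated; it only demonstrates the mechanics.
from collections import Counter
from itertools import product

from sklearn.ensemble import GradientBoostingClassifier

K = 4
ALL_KMERS = ["".join(p) for p in product("ACGT", repeat=K)]  # 256 features

def kmer_vector(sequence: str) -> list[int]:
    """Count occurrences of every length-K substring in a DNA sequence."""
    counts = Counter(sequence[i:i + K] for i in range(len(sequence) - K + 1))
    return [counts.get(kmer, 0) for kmer in ALL_KMERS]

# Hypothetical labeled genome fragments: 1 = resistant, 0 = susceptible.
genomes = [
    ("ACGTACGTGGCCGGCCACGT" * 5, 1),
    ("GGCCGGCCGGCCACGTACGT" * 5, 1),
    ("ATATATATCGCGCGCGATAT" * 5, 0),
    ("CGCGATATATATCGCGCGCG" * 5, 0),
]
X = [kmer_vector(seq) for seq, label in genomes]
y = [label for seq, label in genomes]

model = GradientBoostingClassifier().fit(X, y)
new_isolate = "ACGTGGCCGGCCACGTACGT" * 5
print(model.predict([kmer_vector(new_isolate)]))  # expect class 1 (resistant)
```

The appeal of the approach is speed: once a model like this is trained, a sequenced isolate can be scored in milliseconds, versus the 16-to-20-hour culture-based test described above.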

Houston Methodist Hospital, which uses the PATRIC database, is researching multidrug-resistant bacteria, specifically MRSA. Not only does resistance increase the cost of care, but people with MRSA are 64% more likely to die than people with a nonresistant form of the infection, according to WHO. Houston Methodist is investigating the molecular genetic causes of drug resistance in MRSA in order to identify new treatment approaches and help develop novel antimicrobial agents.

The Hunt for a New Class of Antibiotics

There are antibiotic-resistant bacteria, and then there’s Clostridium difficile—a.k.a. C. difficile—a bacterium that attacks the intestines even in young and healthy patients in hospitals after the use of antibiotics.

It is because of C. difficile that Dr. L. Clifford McDonald jumped into the AMR fight. The epidemiologist was finishing his work analyzing the spread of SARS in Toronto hospitals in 2004 when he turned his attention to C. difficile, convinced that the bacteria would become more common and more deadly. He was right, and today he’s at the forefront of treating the infection and preventing the spread of AMR as senior advisor for science and integrity in the CDC’s Division of Healthcare Quality Promotion. “[AMR] is an area that we’re funding heavily…insofar as the CDC budget can fund anything heavily,” says McDonald, whose group has awarded $14 million in contracts for innovative anti-AMR approaches.

Developing new antibiotics is a major part of the AMR battle. The majority of new antibiotics developed in recent years have been variations of existing drug classes. It’s been three decades since the last new class of antibiotics was introduced. Less than 5% of venture capital in pharmaceutical R&D is focused on antimicrobial development. A 2008 study found that less than 10% of the 167 antibiotics in development at the time had a new “mechanism of action” to deal with multidrug resistance. “The low-hanging fruit [of antibiotic development] has been picked,” noted a WHO report.

Researchers will have to dig much deeper to develop novel medicines. Machine learning could help drug developers sort through much larger data sets and go about the capital-intensive drug development process in a more prescriptive fashion, synthesizing those molecules most likely to have an impact.

McDonald believes that it will become easier to find new antibiotics if we gain a better understanding of the communities of bacteria living in each of us—as many as 1,000 different types of microbes live in our intestines, for example. Disruption to those microbial communities—our “microbiome”—can herald AMR. McDonald says that Big Data and machine learning will be needed to unlock our microbiomes, and that’s where much of the medical community’s investment is going.

He predicts that within five years, hospitals will take fecal samples or skin swabs and sequence the microorganisms in them as a kind of pulse check on antibiotic resistance. “Just doing the bioinformatics to sort out what’s there and the types of antibiotic resistance that might be in that microbiome is a Big Data challenge,” McDonald says. “The only way to make sense of it, going forward, will be advanced analytic techniques, which will no doubt include machine learning.”

Reducing Resistance on the Farm

Bringing information closer to where it’s needed could also help reduce agriculture’s contribution to the antibiotic resistance problem. Antibiotics are widely given to livestock to promote growth or prevent disease. In the United States, more kilograms of antibiotics are administered to animals than to people, according to data from the FDA.

One company has developed a rapid, on-farm diagnostics tool to provide livestock producers with more accurate disease detection to make more informed management and treatment decisions, which it says has demonstrated a 47% to 59% reduction in antibiotic usage. Such systems, combined with pressure or regulations to reduce antibiotic use in meat production, could also help turn the AMR tide.

Breaking Down Data Silos Is the First Step

Adding to the complexity of the fight against AMR is the structure and culture of the global healthcare system itself. Historically, healthcare has been a siloed industry, notorious for its scattered approach focused on transactions rather than healthy outcomes or the true value of treatment. There’s no definitive data on the impact of AMR worldwide; the best we can do is infer estimates from the information that does exist.

The biggest issue is the availability of good data to share through mobile solutions, to drive HCI clinical-decision support tools, and to feed supercomputers and machine-learning platforms. “We have a fragmented healthcare delivery system and therefore we have fragmented information. Getting these sources of data all into one place and then enabling them all to talk to each other has been problematic,” McDonald says.

Collecting, integrating, and sharing AMR-related data on a national and ultimately global scale will be necessary to better understand the issue. HCI and mobile tools can help doctors, hospitals, and public health authorities collect more information while advanced analytics, machine learning, and in-memory computing can enable them to analyze that data in close to real time. As a result, we’ll better understand patterns of resistance from the bedside to the community and up to national and international levels, says Solomon. The good news is that new technology capabilities like AI and new potential streams of data are coming online as an era of data sharing in healthcare is beginning to dawn, adds McDonald.

The ideal goal is a digitally enabled virtuous cycle of information and treatment that could save millions of dollars, lives, and perhaps even civilization if we can get there. D!

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Dr. David Delaney is Chief Medical Officer for SAP.

Joseph Miles is Global Vice President, Life Sciences, for SAP.

Walt Ellenberger is Senior Director Business Development, Healthcare Transformation and Innovation, for SAP.

Saravana Chandran is Senior Director, Advanced Analytics, for SAP.

Stephanie Overby is an independent writer and editor focused on the intersection of business and technology.



4 Traits Set Digital Leaders Apart From 97% Of The Competition

Vivek Bapat

Like the classic parable of the blind men and the elephant, it seems everyone has a unique take on digital transformation. Some equate digital transformation with emerging technologies, placing their bets on the Internet of Things, machine learning, and artificial intelligence. Others see it as a way to increase efficiencies and change business processes to bring products to market faster. Still others think of it as a means of strategic differentiation, innovating new business models for serving and engaging their customers. Despite the range of viewpoints, many businesses are still struggling to evolve digitally in ways that are meaningful, industry-disruptive, and market-leading.

According to a recent study of more than 3,000 senior executives across 17 countries and regions, only a paltry three percent of businesses worldwide have successfully completed enterprise-wide digital transformation initiatives, even though 84% of C-level executives rank such efforts as “critically important” to the fundamental sustenance of their business.

The most comprehensive global study of its kind, the SAP Center for Business Insight report “SAP Digital Transformation Executive Study: 4 Ways Leaders Set Themselves Apart,” produced in collaboration with Oxford Economics, identified the challenges, opportunities, value, and key technologies driving digital transformation. The findings specifically analyzed the performance of “digital leaders” – those who are connecting people, things, and businesses more intelligently and effectively, and creating punctuated change faster than their less advanced rivals.

After analyzing the data, we found it eye-opening that only three percent of companies (the top 100) are successfully realizing their full potential through digital transformation. Even more remarkable, these leaders share four fundamental traits, regardless of their region of operation, their size, their organizational structure, or their industry.

We distilled these traits in the hope that others who are in the early stages of transformation, or are still struggling to find their bearings, can embrace these principles in order to succeed. Ultimately I see these leaders as truly ambidextrous organizations, managing evolutionary and revolutionary change simultaneously, willing to embrace innovation – not just on the edges of their business, but firmly at their core.

Here are the four traits that set these leaders apart from the rest:

Trait #1: They see digital transformation as truly transformational

An overwhelming majority (96%) of digital leaders view digital transformation as a core business goal that requires a unified digital mindset across the entire enterprise. But instead of allowing individual functions to change at their own pace, digital leaders prefer to evolve the organization to help ensure the success of their digital strategies.

The study found that 56% of these businesses regularly shift their organizational structure, which includes processes, partners, suppliers, and customers, compared to 10% of remaining companies. Plus, 70% actively bring lines of business together through cross-functional processes and technologies.

By creating a firm foundation for transformation, digital leaders are further widening the gap between themselves and their less advanced competitors as they innovate business models that can mitigate emerging risks and seize new opportunities quickly.

Trait #2: They focus on transforming customer-facing functions first

Although most companies believe technology, the pace of change, and growing global competition are the key global trends that will affect everything for years to come, digital leaders are expanding their frame of mind to consider the influence of customer empowerment. Executives who build momentum for breakthrough innovation and industry transformation are the ones who move beyond the high stakes of the market to activate complete, end-to-end customer experiences.

In fact, 92% of digital leaders have established sophisticated digital transformation strategies and processes to drive transformational change in customer satisfaction and engagement, compared to 22% of their less mature counterparts. As a result, 70% have realized significant or transformational value from these efforts.

Trait #3: They create a virtuous cycle of digital talent

There’s little doubt that the competition for qualified talent is fierce. But for nearly three-quarters of companies that demonstrate digital-transformation leadership, it is easier to attract and retain talent because they are five times more likely to leverage digitization to change their talent management efforts.

The impact of their efforts goes beyond empowering recruiters to identify best-fit candidates, highlight risk factors and hiring errors, and predict long-term talent needs. Nearly half (48%) of digital leaders understand that they must invest heavily in the development of digital skills and technology to drive revenue, retain productive employees, and create new roles to keep up with their digital maturity over the next two years, compared to 30% of all surveyed executives.

Trait #4: They invest in next-generation technology using a bimodal architecture

A couple years ago, Peter Sondergaard, senior vice president at Gartner and global head of research, observed that “CIOs can’t transform their old IT organization into a digital startup, but they can turn it into a bi-modal IT organization. Forty-five percent of CIOs state they currently have a fast mode of operation, and we predict that 75% of IT organizations will be bimodal in some way by 2017.”

Based on the results of the SAP Center for Business Insight study, Sondergaard’s prediction was spot on. As digital leaders dive into advanced technologies, 72% are using a digital twin of the conventional IT organization to operate efficiently without disruption while refining innovative scenarios to resolve business challenges and integrate them to stay ahead of the competition. Unfortunately, only 30% of less advanced businesses embrace this view.

Working within this bimodal architecture is emboldening digital leaders to take on incredibly progressive technology. For example, the study found that 50% of these firms are using artificial intelligence and machine learning, compared to seven percent of all respondents. They are also leading the adoption curve of Big Data solutions and analytics (94% vs. 60%) and the Internet of Things (76% vs. 52%).

Digital leadership is a practice of balance, not pure digitization

Most executives understand that digital transformation is a critical driver of revenue growth, profitability, and business expansion. However, as digital leaders are proving, digital strategies must deliver a balance of organizational flexibility, forward-looking technology adoption, and bold change. And clearly, this approach is paying dividends for them. They are growing market share, increasing customer satisfaction, improving employee engagement, and, perhaps more important, achieving more profitability than ever before.

For any company looking to catch up to digital leaders, the conversation around digital transformation needs to change immediately to combat three deadly sins: Stop investing in one-off, isolated projects hidden in a single organization. Stop viewing IT as an enabler instead of a strategic partner. Stop walling off the rest of the business from siloed digital successes.

As our study shows, companies that treat their digital transformation as an all-encompassing, all-sharing, and all-knowing business imperative will be the ones that disrupt the competitive landscape and stay ahead of a constantly evolving economy.

Follow me on Twitter @vivek_bapat.

For more insight on digital leaders, check out the SAP Center for Business Insight report, conducted in collaboration with Oxford Economics, “SAP Digital Transformation Executive Study: 4 Ways Leaders Set Themselves Apart.”


About Vivek Bapat

Vivek Bapat is the Senior Vice President, Global Head of Marketing Strategy and Thought Leadership, at SAP. He leads SAP's Global Marketing Strategy, Messaging, Positioning and related Thought Leadership initiatives.