2013 Cloud Predictions [Feature]

Jen Cohen Crompton

As 2013 approaches, businesses are preparing for the new year by allocating budgets, reorganizing resources, and launching new (or optimizing existing) strategies.

During this process, one consideration for many CTOs, IT departments, and those working to create and manage an efficient (and, for most, cost-effective) virtualization infrastructure is how best to integrate and use cloud technology options.

To prepare for the transition into the new year and better understand where cloud technology is headed, here are some bold predictions from industry experts who have a real sense of what is to come in 2013. Find out what they are predicting and why they believe it will happen next year.

Prediction One: The Cloud Will Become a Necessity, Not an Option

Cloud adoption experienced quite a boost in 2012. In a July 2012 article, the Wall Street Journal reported that, “[A] survey, conducted by IT industry association CompTIA, found that more than eight in 10 companies use some form of cloud technology.” With the growing understanding of the importance of collecting big data, the need to analyze and manage that data, and the increasing number of employees who want to work outside the confines of a desk and the four walls of an office, cloud adoption is gaining ground…and it will continue to climb in 2013, making the cloud a necessity.

Those who adopted cloud technology strategies found that the flexibility of the cloud provided a cost-effective way to increase storage space as their structured and unstructured data needed a place to rest. The massive amounts of data were deemed important, but as social media and ecommerce data became more robust, the data sets grew into larger files requiring additional bandwidth. As Dick Csaplar, Senior Research Analyst, Virtualization and the Cloud, Aberdeen Group, points out, “A survey on cloud storage conducted by Aberdeen in June of 2012 found that the amount of data stored by companies grew at an average rate of 35 percent per year.” Imagine how much data companies will have by the end of next year.
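
To put that 35 percent figure in perspective, here is a quick back-of-the-envelope calculation; the 100 TB starting volume is purely an illustrative assumption, not a figure from the survey:

```python
# Back-of-the-envelope: compound data growth at 35% per year.
# The 100 TB starting volume is a hypothetical figure for illustration.
initial_tb = 100.0
annual_growth = 0.35

for year in range(1, 6):
    volume = initial_tb * (1 + annual_growth) ** year
    print(f"Year {year}: {volume:,.0f} TB")

# Year 1: 135 TB ... Year 5: ~448 TB. At this rate, storage needs
# roughly quadruple in five years, which is why elastic cloud
# capacity is attractive compared with buying hardware up front.
```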

Businesses also experienced a continued uptick in employees who prefer working from home, or anywhere outside the office, rather than being hardwired into company systems. The shift in corporate structure (using contractors and satellite employees) required flawless accessibility to files and simplified collaboration to enable quicker idea exchange and business innovation – all problems solved by the cloud.

Prediction Two: BYOD (Bring Your Own Device) Policies Will Be Ubiquitous

BYOD was on the rise in 2012, with a growing percentage of large corporations allowing employees to use their own mobile devices for company-related purposes. The trend is attributed to individuals upgrading their devices more quickly than corporations do and wanting the flexibility to access files without being in the office. Companies, while still cautious about the concept, are becoming more accepting and are implementing BYOD policies to ensure their confidential information will not be compromised. Even the federal government provides a working toolkit to help federal agencies craft a BYOD policy that covers all the bases.

As Bill Rosenthal, CEO of Logical Operations, Inc., points out, “With more work being done away from the office, employees who can access the company’s systems with their own devices will be more productive,” so the positives of this allowance (increased productivity and a quicker flow of communication) are outweighing the risks.

The issue is no longer, “Should we allow it?” Instead, it’s, “How do we protect and ensure security during employment and beyond?” The answer is a defined, specific, and legally approved policy – which is why such policies will become ubiquitous.

Prediction Three: Hackers Will Hack On in an Epic Way

The prediction most experts shared was that there will be a cyberattack targeting cloud services and highly sensitive, confidential information. Dave Jevans, Founder and CTO of MarbleCloud, believes that the cyberattacks will come from hostile governments and that hackers will deploy phishing sites and malware with the goal of stealing data and cloud passwords from individuals’ computers.

Zack Shuler, Founder and CEO of Cal Net Technology, takes it one step further, adding that, “A major cloud vendor will be attacked by a foreign hack group leaving their cloud services unstable for days/weeks,” and this will result in major distrust of the technology, sparking a reconsideration of the type of information stored in the cloud. So, security will resurface as a major consideration.

And since the beauty of cloud services lies in accessibility across devices, attacks on cloud-based services will not only threaten and potentially compromise data, but will also affect mobile devices as mobile malware attacks increase. Ellie Bitton, Senior Director of Product Management-Virtualization at Fortinet, points out, “As more businesses migrate to cloud-based services, cyber criminals next year will find a way to compromise them.” Bitton cites the example of Android’s Cloud to Device Messaging, which suffered a malware attack that monitored incoming and outgoing messages without the owner’s knowledge.

So keep your employees close and your hackers closer.

Prediction Four: The Public vs. Private vs. Hybrid Cloud Debate Continues…

There seems to be no end in sight for the debate about which cloud solution is better, and that’s a good thing, since each can be designed for a different purpose, allowing some flexibility in the design of the implemented cloud technology.

Part of the concern with the public vs. private cloud question is that most organizations aren’t quite sure what the choice means or what the implications of going in either direction are. Due to privacy issues concerning enterprise data, companies that can afford it generally opt to create a customized private cloud and/or use a hybrid option in which the private cloud stores the data and the public cloud provides the functionality and collaboration capabilities. As Virtustream Chief Marketing and Strategy Officer Simon Aspinall believes, “As more enterprises look to move their legacy applications to the cloud, they will find a hybrid cloud to be the most suitable…it [the solution] will combine the scalability and savings of a public cloud with the security of a private cloud for compliance requirements and enable isolation of sensitive data.”

But as private cloud options become more affordable and organizations see the value in this type of system, private cloud adoption may overtake the hybrid approach. Vishal Awasthi, CTO at Dolphin, predicts that, “we [will] see a trend where commoditized peripheral applications will continue to move into a multi-tenant true SaaS model, while core enterprise ‘system-of-records’ applications will remain on-premise, or at best utilize virtualization on private or hybrid cloud.”

Either way, Andrew Hay, Chief Evangelist at CloudPassage, sees “organizations investing substantial time deciding how to extend their current compute strategy to give them the required cloud capabilities, and some may even consider replacing their technology outright if a migration or augmentation path cannot be found.”

The search for the perfect solution will continue.

Prediction Five: IT Departments Will Change

Since more information will be stored in the cloud, IT departments will receive a level of support, depending on the cloud service provider, but will have to adjust their skills by learning the ins and outs of the new technology and serving as a resource on how to support it. IT departments might see their staff working to craft and implement the necessary BYOD policies and to train employees on cybersecurity, something that will become increasingly important.

Chris Ortbals, Vice President of Cloud Services at Cbeyond, also sees IT departments becoming more responsible for making informed decisions about technology choices as companies may “bounce from cloud to cloud. IT leaders will have a better grasp on what to realistically expect from their cloud services…and the transition of the cloud from a bonus to an expectation will result in lengthier negotiations. Those businesses that began working within the cloud during its early stages…will likely reach the end of their first cloud contract in 2013.”

So there you have it: some of the top cloud predictions for 2013. Now if only those experts could predict which cloud service will be hacked – or better yet, predict some cloud technology superheroes who will prove us all wrong!

About Jen Cohen Crompton

Jen Cohen Crompton is a SAP Blogging Correspondent reporting on big data, cloud computing, enterprise mobility, analytics, sports and tech, and anything else innovation-related. When she's not blogging, she can be caught marketing, using social media and/or presenting at conferences around the world. Disclosure: Jen is being compensated by SAP to produce a series of articles on the innovation topics covered on this site. The opinions reflected here are her own.

Innovation Without Boundaries: Why The Cloud Matters

Michael Haws

Is it possible to innovate without boundaries?

Of course – if you are using the cloud. An actual cloud doesn’t have any boundaries. It’s fluid. But more important, it can provide the much-needed precipitation that brings nature to life. So it is with cloud technology – it’s your ideas that can grow and transform your business.

Running your business in the cloud is no longer just a consideration during a typical use-case exercise. Business executives now face decisions on solutions that go beyond the previous limitations of cloud computing. Selecting the latest tools to address a business process gap is now less about features and more about functionality.

It doesn’t matter whether your organization is experienced with cloud solutions or new to the concept. Cloud technology is quickly becoming a core part of addressing the needs of a growing business.

5 considerations when planning your journey to the cloud

How can your organization define its successful path to the cloud? Here are five things you should consider when investigating whether a move to the cloud is right for you.

1. Understanding the cloud is great, but putting it into action is another thing.

For most CIOs, putting a cloud strategy on paper is new territory. Cloud computing now spans a range of models, from pure managed services to software-as-a-service (SaaS). Just as legacy computing had different flavors, so does cloud technology.

2. There is more than one way to innovate in the cloud.

Alignment with an open cloud reference architecture can help your CIO deliver on the promises of the cloud while using a stair-step approach to cloud adoption – from on-premise to hybrid to full cloud computing. Some companies find their own path by constantly reevaluating their needs and shifting their focus when necessary – making the move from running a data center to delivering real value to stakeholders, for example.

3. The cloud can help accelerate processes and lower cost.

By recognizing unprecedented growth, your organization can embark on a path to significant transformation that powers greater agility and competitiveness. Choose a solution set that best meets your needs, and implement and support it moving forward. By leveraging the cloud to support the chosen solution, ongoing maintenance, training, and system issues become the cloud provider’s responsibility. And for you, this offers the freedom to focus on the core business.

4. You can lock down your infrastructure and ensure more efficient processes.

Do you use a traditional reporting engine against a large relational database to generate a sequential batched report to close your books at quarter’s end? If so, you’re not alone. Sure, a new solution with new technology may be an obvious improvement. But how valuable to your board will you become when you reduce the financial closing process by 1–3 days? That’s the beauty of the cloud: You can accelerate the deployment of your chosen solution and realize ROI quickly – even before the next full reporting period.

5. The cloud opens the door to new opportunity in a secure environment.

For many companies, moving to the cloud may seem impossible due to the time and effort needed to train workers and hire resources with the right skill sets. Plus, if you are a startup in a rural location, it may not be as easy to attract the right talent as it is for your Silicon Valley counterparts. The cloud allows your business to secure your infrastructure as well as recruit and onboard those hard-to-find resources by applying a managed services contract to run your cloud model.

The cloud means many things to different people. What’s your path?

With SAP HANA Enterprise Cloud service, you can navigate the best path to building, running, and operating your own cloud when running critical business processes. Find out how SAP HANA Enterprise Cloud can deliver the speed and resources necessary to quickly validate and realize solid ROI.

Check out the video below or visit us at www.sap.com/services-support/svc/in-memory-computing/hana-consulting/enterprise-cloud-services/index.html.

Connect with us on Twitter: @SAPServices

About Michael Haws

Michael Haws is the Vice President of HANA Enterprise Cloud at SAP. His specialties include Enterprise Resource Planning Software & Services, Onshore, Nearshore, Offshore--Application, Infrastructure and Business Process Outsourcing.

Consumers And Providers: Two Halves Of The Hybrid Cloud Equation

Marty McCormick

Long gone are the days of CIOs and IT managers freely spending money to move their existing systems to the cloud without any real business justification, just to be part of the latest hype. As cloud deployments become more prevalent, IT leaders are now tasked with proving the tangible benefits of adopting a cloud strategy from an operational, efficiency, and cost perspective. At the same time, they must balance their end users’ increasing demand for access to more data from an ever-expanding list of public cloud sources.

Lately, public cloud systems have become part of IT landscapes, both as multi-tenant systems, such as software-as-a-service (SaaS) offerings, and as data-consumption applications such as Twitter. Along with the integration of applications and data outside of the corporate domain, new architectures have been spawned, requiring real-time and seamless integration points. As shown in the figure below, these hybrid clouds – loosely defined as the integration of data from systems in both public and private clouds in a unified fashion – are the foundation of this new IT architecture.

[Figure: Hybrid cloud architecture – data from public and private cloud systems integrated in a unified fashion]
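
As a rough illustration of what such an integration point does, here is a minimal sketch that presents records from a hypothetical private, on-premise system and a public cloud feed as one unified view; the systems, field names, and values are all invented for the example:

```python
# Toy sketch of a hybrid-cloud integration point: joining records
# from a private on-premise system with data pulled from a public
# cloud source. All systems and data here are hypothetical.
private_crm = [  # lives in the private cloud / on-premise database
    {"customer": "Acme Corp", "open_orders": 3},
    {"customer": "Globex", "open_orders": 1},
]

public_social = [  # fetched from a public cloud API (e.g., a Twitter feed)
    {"customer": "Acme Corp", "recent_mentions": 42},
    {"customer": "Globex", "recent_mentions": 7},
]

def unified_view(private_rows, public_rows, key="customer"):
    """Merge records from both clouds on a shared key."""
    merged = {row[key]: dict(row) for row in private_rows}
    for row in public_rows:
        merged.setdefault(row[key], {key: row[key]}).update(row)
    return list(merged.values())

for record in unified_view(private_crm, public_social):
    print(record)
# {'customer': 'Acme Corp', 'open_orders': 3, 'recent_mentions': 42} ...
```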

Not only has the hybrid cloud changed a company’s approach to deploying new software, but it has also changed the way software is developed and sold from a provider’s perspective.

The provider perspective: Unifying development and operations

Thanks to the hybrid cloud approach, system administrators and developers sit side by side in an agile model known as Development and Operations (DevOps). By increasing collaboration, communication, innovation, and problem resolution, development teams can closely collaborate with system administrators and maintain a continuous feedback loop between both sides of the agile methodology.

For example, operations teams can provide feedback on reported software bugs, software support issues, and new feature requests to development teams in real time. Likewise, development teams develop and test new applications with support and maintainability as a key pillar of design.

After seeing the advantages realized by cloud providers that embraced this approach long ago, other companies that have traditionally separated these two areas are now adopting the DevOps model.

The consumer perspective: Moving to the cloud on its own terms

From the standpoint of the corporate consumer, hybrid cloud deployments bring a number of advantages to an IT organization. Specifically, the hybrid approach allows companies to move some application functionality to the cloud at their own pace.

Many applications naturally lend themselves to public cloud domains, given their application and data requirements. For most companies, HR, indirect procurement, travel, and CRM systems are the first to be deployed in a public cloud. This approach eliminates the requirement for building and operating these applications in house while allowing IT areas to take advantage of new features and technologies much faster.

However, there is one challenge consumers need to overcome: the lack of capabilities needed to extend these applications to meet business requirements when the standard offering is insufficient. Unfortunately, this tempts organizations to create extensive custom applications that replicate information across a variety of systems to meet end-user requirements. This development work can offset the cost benefits of the initial cloud application, especially when you consider the upgrades and support required to maintain it.

What this all means to everyone involved in the hybrid cloud

Given these two perspectives, on-premise software providers are transforming themselves so they can meet the ever-evolving demands of today’s information consumer. In particular, they are preparing for these unique challenges facing customers and creating a smooth journey to a hybrid cloud.

Take SAP, for example. By adopting a DevOps model to break down a huge internal barrier and allow tighter collaboration, the company has delivered a simpler approach to hybrid cloud deployments through SAP HANA Cloud Platform for extending applications and SAP HANA Enterprise Cloud for hosting solutions.

Find out how these two innovations can help you implement a robust and secure hybrid cloud solution:
SAP HANA Cloud Platform
SAP HANA Enterprise Cloud

About Marty McCormick

Marty McCormick is the Lead Technical Architect, Managed Cloud Delivery, at SAP. He is experienced in a wide range of SAP solutions, including SAP NetWeaver, SAP Portal, SAP CRM, SAP SRM, SAP MDM, SAP BI, and SAP ERP.

Heroes in the Race to Save Antibiotics

Dr. David Delaney, Joseph Miles, Walt Ellenberger, Saravana Chandran, and Stephanie Overby

Last August, a woman arrived at a Reno, Nevada, hospital and told the attending doctors that she had recently returned from an extended trip to India, where she had broken her right thighbone two years earlier. The woman, who was in her 70s, had subsequently developed an infection in her thigh and hip for which she was hospitalized in India several times. The Reno doctors recognized that the infection was serious—and the visit to India, where antibiotic-resistant bacteria run rampant, raised red flags.

When none of the 14 antibiotics the physicians used to treat the woman worked, they sent a sample of the bacterium to the U.S. Centers for Disease Control (CDC) for testing. The CDC confirmed the doctors’ worst fears: the woman had a class of microbe called carbapenem-resistant Enterobacteriaceae (CRE). Carbapenems are a powerful class of antibiotics used as last-resort treatment for multidrug-resistant infections. The CDC further found that, in this patient’s case, the pathogen was impervious to all 26 antibiotics approved by the U.S. Food and Drug Administration (FDA).

In other words, there was no cure.

This is just the latest alarming development signaling the end of the road for antibiotics as we know them. In September, the woman died from septic shock, in which an infection takes over and shuts down the body’s systems, according to the CDC’s Morbidity and Mortality Weekly Report.

Other antibiotic options, had they been available, might have saved the Nevada woman. But the solution to the larger problem won’t be a new drug. It will have to be an entirely new approach to the diagnosis of infectious disease, to the use of antibiotics, and to the monitoring of antimicrobial resistance (AMR)—all enabled by new technology.

But that new technology is not being implemented fast enough to prevent what former CDC director Tom Frieden has nicknamed “nightmare bacteria.” And the nightmare is becoming scarier by the year. A 2014 British study calculated that 700,000 people die globally each year because of AMR. By 2050, according to a 2014 estimate, antibiotic resistance could cause 10 million deaths a year and a cumulative US$100 trillion in economic damage. And the rate of AMR is growing exponentially, thanks to the speed with which humans serving as hosts for these nasty bugs can move among healthcare facilities—or countries. In the United States, for example, CRE had been seen only in North Carolina in 2000; today it’s nationwide.

Abuse and overuse of antibiotics in healthcare and livestock production have enabled bacteria to both mutate and acquire resistant genes from other organisms, resulting in truly pan-drug resistant organisms. As ever-more powerful superbugs continue to proliferate, we are potentially facing the deadliest and most costly human-made catastrophe in modern times.

“Without urgent, coordinated action by many stakeholders, the world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill,” said Dr. Keiji Fukuda, assistant director-general for health security for the World Health Organization (WHO).

Even if new antibiotics could solve the problem, there are obstacles to their development. For one thing, antibiotics have complex molecular structures, which slows the discovery process. Further, they aren’t terribly lucrative for pharmaceutical manufacturers: public health concerns call for new antimicrobials to be financially accessible to patients and used conservatively precisely because of the AMR issue, which reduces the financial incentives to create new compounds. The last entirely new class of antibiotics was introduced 30 years ago. Finally, bacteria will develop resistance to new antibiotics as well if we don’t adopt new approaches to using them.

Technology can play the lead role in heading off this disaster. Vast amounts of data from multiple sources are required for better decision making at all points in the process, from tracking or predicting antibiotic-resistant disease outbreaks to speeding the potential discovery of new antibiotic compounds. However, microbes will quickly adapt and resist new medications, too, if we don’t also employ systems that help doctors diagnose and treat infection in a more targeted and judicious way.

Indeed, digital tools can help in all four actions that the CDC recommends for combating AMR: preventing infections and their spread, tracking resistance patterns, improving antibiotic use, and developing new diagnostics and treatment.

Meanwhile, individuals who understand both the complexities of AMR and the value of technologies like machine learning, human-computer interaction (HCI), and mobile applications are working to develop and advocate for solutions that could save millions of lives.

Keeping an Eye Out for Outbreaks

Like others who are leading the fight against AMR, Dr. Steven Solomon has no illusions about the difficulty of the challenge. “It is the single most complex problem in all of medicine and public health—far outpacing the complexity and the difficulty of any other problem that we face,” says Solomon, who is a global health consultant and former director of the CDC’s Office of Antimicrobial Resistance.

Solomon wants to take the battle against AMR beyond the laboratory. In his view, surveillance—tracking and analyzing various data on AMR—is critical, particularly given how quickly and widely it spreads. But surveillance efforts are currently fraught with shortcomings. The available data is fragmented and often not comparable. Hospitals fail to collect the representative samples necessary for surveillance analytics, collecting data only on those patients who experience resistance and not on those who get better. Laboratories use a wide variety of testing methods, and reporting is not always consistent or complete.

Surveillance can serve as an early warning system. But weaknesses in these systems have caused public health officials to consistently underestimate the impact of AMR in loss of lives and financial costs. That’s why improving surveillance must be a top priority, says Solomon, who previously served as chair of the U.S. Federal Interagency Task Force on AMR and has been tracking the advance of AMR since he joined the U.S. Public Health Service in 1981.

A Collaborative Diagnosis

Ineffective surveillance has also contributed to huge growth in the use of antibiotics when they aren’t warranted. Strong patient demand and financial incentives for prescribing physicians are blamed for antibiotics abuse in China. India has become the largest consumer of antibiotics on the planet, in part because they are prescribed or sold for diarrheal diseases and upper respiratory infections for which they have limited value. And many countries allow individuals to purchase antibiotics over the counter, exacerbating misuse and overuse.

In the United States, antibiotics are improperly prescribed 50% of the time, according to CDC estimates. One study of adult patients visiting U.S. doctors to treat respiratory problems found that more than two-thirds of antibiotics were prescribed for conditions that were not infections at all or for infections caused by viruses—for which an antibiotic would do nothing. That’s 27 million courses of antibiotics wasted a year—just for respiratory problems—in the United States alone.

And even in countries with national guidelines for prescribing antibiotics, those guidelines aren’t always followed. A study published in the medical journal Family Practice showed that Swedish doctors, both those trained in Sweden and those trained abroad, inconsistently followed rules for prescribing antibiotics.

Solomon strongly believes that, worldwide, doctors need to expand their use of technology in their offices or at the bedside to guide them through a more rational approach to antibiotic use. Doctors have traditionally been reluctant to adopt digital technologies, but Solomon thinks that the AMR crisis could change that. New digital tools could help doctors and hospitals integrate guidelines for optimal antibiotic prescribing into their everyday treatment routines.

“Human-computer interactions are critical, as the amount of information available on antibiotic resistance far exceeds the ability of humans to process it,” says Solomon. “It offers the possibility of greatly enhancing the utility of computer-assisted physician order entry (CPOE), combined with clinical decision support.” Healthcare facilities could embed relevant information and protocols at the point of care, guiding the physician through diagnosis and prescription and, as a byproduct, facilitating the collection and reporting of antibiotic use.
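
To make the idea concrete, here is a toy sketch of what such a check at the point of order entry might look like. The rules, diagnoses, and drug names are invented for illustration and are not clinical guidance; a real CPOE system would encode vetted guidelines and report into the hospital's surveillance data:

```python
# Toy sketch of a clinical-decision-support check at antibiotic
# order entry. Rules, diagnoses, and drug names are hypothetical.
from dataclasses import dataclass, field

VIRAL_DIAGNOSES = {"common cold", "influenza", "viral bronchitis"}
BROAD_SPECTRUM = {"meropenem", "piperacillin-tazobactam"}

@dataclass
class AntibioticOrder:
    patient_id: str
    diagnosis: str
    drug: str
    culture_ordered: bool = False
    warnings: list = field(default_factory=list)

def review_order(order: AntibioticOrder) -> AntibioticOrder:
    """Apply simple stewardship rules and collect warnings."""
    if order.diagnosis.lower() in VIRAL_DIAGNOSES:
        order.warnings.append(
            "Diagnosis is viral: antibiotics are not indicated.")
    if order.drug.lower() in BROAD_SPECTRUM and not order.culture_ordered:
        order.warnings.append(
            "Broad-spectrum agent without a culture: consider a "
            "targeted drug or order a culture first.")
    return order

# Each reviewed order could also be logged, giving the surveillance
# reporting Solomon describes as a byproduct of everyday ordering.
order = review_order(AntibioticOrder("pt-001", "influenza", "meropenem"))
for warning in order.warnings:
    print("ALERT:", warning)
```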

Cincinnati Children’s Hospital’s antibiotic stewardship division has deployed a software program that gathers information from electronic medical records, order entries, computerized laboratory and pathology reports, and more. The system measures baseline antimicrobial use, dosing, duration, costs, and use patterns. It also analyzes bacteria and trends in their susceptibilities and helps with clinical decision making and prescription choices. The goal, says Dr. David Haslam, who heads the program, is to decrease the use of “big gun” super antibiotics in favor of more targeted treatment.

While this approach is not yet widespread, there is consensus that incorporating such clinical decision support into electronic health records will help improve quality of care, contain costs, and reduce overtreatment in healthcare overall—not just for AMR. A 2013 randomized clinical trial found that doctors who used decision-support tools were significantly less likely to order antibiotics than those in the control group and prescribed 50% fewer broad-spectrum antibiotics.

Putting mobile devices into doctors’ hands could also help them accept decision support, believes Solomon. Last summer, Scotland’s National Health Service developed an antimicrobial companion app to give practitioners nationwide mobile access to clinical guidance, as well as an audit tool to support boards in gathering data for local and national use.

“The immediacy and the consistency of the input to physicians at the time of ordering antibiotics may significantly help address the problem of overprescribing in ways that less-immediate interventions have failed to do,” Solomon says. In addition, handheld devices with so-called lab-on-a-chip technology could be used to test clinical specimens at the bedside and transmit the data across cellular or satellite networks in areas where infrastructure is more limited.

Artificial intelligence (AI) and machine learning can also become invaluable technology collaborators to help doctors more precisely diagnose and treat infection. In such a system, “the physician and the AI program are really ‘co-prescribing,’” says Solomon. “The AI can handle so much more information than the physician and make recommendations that can incorporate more input on the type of infection, the patient’s physiologic status and history, and resistance patterns of recent isolates in that ward, in that hospital, and in the community.”

Speed Is Everything

Growing bacteria in a dish has never appealed to Dr. James Davis, a computational biologist with joint appointments at Argonne National Laboratory and the University of Chicago Computation Institute. One of the first of a growing breed of computational biologists, Davis chose a PhD advisor in 2004 who was steeped in bioinformatics technology “because you could see that things were starting to change,” he says. He was one of the first in his microbiology department to submit a completely “dry” dissertation—that is, one that was all digital, with nothing grown in a lab.

Upon graduation, Davis wanted to see if it was possible to predict whether an organism would be susceptible or resistant to a given antibiotic, leading him to explore the potential of machine learning to predict AMR.

As the availability of cheap computing power has gone up and the cost of genome sequencing has come down, it has become possible to sequence a pathogen sample in order to detect its resistance mechanisms. This could allow doctors to identify the nature of an infection in minutes instead of hours or days, says Davis.

Davis is part of a team creating a giant database of bacterial genomes with AMR metadata for the Pathosystems Resource Integration Center (PATRIC), funded by the U.S. National Institute of Allergy and Infectious Diseases to collect data on priority pathogens, such as tuberculosis and gonorrhea.

Because the current inability to identify microbes quickly is one of the biggest roadblocks to making an accurate diagnosis, the team’s work is critically important. The standard method for identifying drug resistance is to take a sample from a wound, blood, or urine and expose the resident bacteria to various antibiotics. If the bacterial colony continues to divide and thrive despite the presence of a normally effective drug, it indicates resistance. The process typically takes between 16 and 20 hours, itself an inordinate amount of time in matters of life and death. For certain strains of antibiotic-resistant tuberculosis, though, such testing can take a week. While physicians are waiting for test results, they often prescribe broad-spectrum antibiotics or make a best guess about what drug will work based on their knowledge of what’s happening in their hospital, “and in the meantime, you either get better,” says Davis, “or you don’t.”

At PATRIC, researchers are using machine-learning classifiers to identify regions of the genome involved in antibiotic resistance that could form the foundation for a “laboratory free” process for predicting resistance. Being able to identify the genetic mechanisms of AMR and predict the behavior of bacterial pathogens without petri dishes could inform clinical decision making and improve reaction time. Thus far, the researchers have developed machine-learning classifiers for identifying antibiotic resistance in Acinetobacter baumannii (a big player in hospital-acquired infection), methicillin-resistant Staphylococcus aureus (a.k.a. MRSA, a worldwide problem), and Streptococcus pneumoniae (a leading cause of bacterial meningitis), with accuracies ranging from 88% to 99%.
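
A minimal sketch of the general technique: represent each genome as counts of short subsequences (k-mers) and fit an off-the-shelf classifier on labeled resistant and susceptible isolates. The sequences, labels, and model choice below are toy assumptions for illustration; PATRIC's production pipelines are far more sophisticated:

```python
# Minimal sketch of genome-based resistance prediction:
# turn sequences into k-mer count features, then train a classifier.
# The sequences and labels below are toy data, not real genomes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

genomes = [
    "ATGGCGTTAGCCGATCGGTTAAGC",   # toy "resistant" isolate
    "ATGGCATTAGCGGATCGCTTAAGC",   # toy "susceptible" isolate
    "ATGGCGTTAGCCGATCGGTTTAGC",
    "ATGGCATTAGCGGATCGCTTTAGC",
]
labels = [1, 0, 1, 0]  # 1 = resistant, 0 = susceptible

def kmers(seq: str, k: int = 8) -> str:
    """Represent a sequence as space-separated overlapping k-mers."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(kmers(g) for g in genomes)
model = LogisticRegression().fit(X, labels)

# Score a new isolate in seconds -- no petri dish required.
new_isolate = vectorizer.transform([kmers("ATGGCGTTAGCCGATCGGTTAAGC")])
print("P(resistant):", model.predict_proba(new_isolate)[0, 1])
```

Inspecting which k-mer features carry the most weight in such a model is one way researchers identify candidate genomic regions involved in resistance.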

Houston Methodist Hospital, which uses the PATRIC database, is researching multidrug-resistant bacteria, specifically MRSA. Not only does resistance increase the cost of care, but people with MRSA are 64% more likely to die than people with a nonresistant form of the infection, according to WHO. Houston Methodist is investigating the molecular genetic causes of drug resistance in MRSA in order to identify new treatment approaches and help develop novel antimicrobial agents.

The Hunt for a New Class of Antibiotics

There are antibiotic-resistant bacteria, and then there’s Clostridium difficile—a.k.a. C. difficile—a bacterium that attacks the intestines of even young and healthy hospital patients after the use of antibiotics.

It is because of C. difficile that Dr. L. Clifford McDonald jumped into the AMR fight. The epidemiologist was finishing his work analyzing the spread of SARS in Toronto hospitals in 2004 when he turned his attention to C. difficile, convinced that the bacteria would become more common and more deadly. He was right, and today he’s at the forefront of treating the infection and preventing the spread of AMR as senior advisor for science and integrity in the CDC’s Division of Healthcare Quality Promotion. “[AMR] is an area that we’re funding heavily…insofar as the CDC budget can fund anything heavily,” says McDonald, whose group has awarded $14 million in contracts for innovative anti-AMR approaches.

Developing new antibiotics is a major part of the AMR battle. The majority of new antibiotics developed in recent years have been variations of existing drug classes. It’s been three decades since the last new class of antibiotics was introduced. Less than 5% of venture capital in pharmaceutical R&D is focused on antimicrobial development. A 2008 study found that less than 10% of the 167 antibiotics in development at the time had a new “mechanism of action” to deal with multidrug resistance. “The low-hanging fruit [of antibiotic development] has been picked,” noted a WHO report.

Researchers will have to dig much deeper to develop novel medicines. Machine learning could help drug developers sort through much larger data sets and go about the capital-intensive drug development process in a more prescriptive fashion, synthesizing those molecules most likely to have an impact.

McDonald believes that it will become easier to find new antibiotics if we gain a better understanding of the communities of bacteria living in each of us—as many as 1,000 different types of microbes live in our intestines, for example. Disruption to those microbial communities—our “microbiome”—can herald AMR. McDonald says that Big Data and machine learning will be needed to unlock our microbiomes, and that’s where much of the medical community’s investment is going.

He predicts that within five years, hospitals will take fecal samples or skin swabs and sequence the microorganisms in them as a kind of pulse check on antibiotic resistance. “Just doing the bioinformatics to sort out what’s there and the types of antibiotic resistance that might be in that microbiome is a Big Data challenge,” McDonald says. “The only way to make sense of it, going forward, will be advanced analytic techniques, which will no doubt include machine learning.”

Reducing Resistance on the Farm

Bringing information closer to where it’s needed could also help reduce agriculture’s contribution to the antibiotic resistance problem. Antibiotics are widely given to livestock to promote growth or prevent disease. In the United States, more kilograms of antibiotics are administered to animals than to people, according to data from the FDA.

One company has developed a rapid, on-farm diagnostics tool to provide livestock producers with more accurate disease detection to make more informed management and treatment decisions, which it says has demonstrated a 47% to 59% reduction in antibiotic usage. Such systems, combined with pressure or regulations to reduce antibiotic use in meat production, could also help turn the AMR tide.

Breaking Down Data Silos Is the First Step

Adding to the complexity of the fight against AMR is the structure and culture of the global healthcare system itself. Historically, healthcare has been a siloed industry, notorious for its scattered approach focused on transactions rather than healthy outcomes or the true value of treatment. There’s no definitive data on the impact of AMR worldwide; the best we can do is infer estimates from the information that does exist.

The biggest issue is the availability of good data to share through mobile solutions, to drive HCI clinical-decision support tools, and to feed supercomputers and machine-learning platforms. “We have a fragmented healthcare delivery system and therefore we have fragmented information. Getting these sources of data all into one place and then enabling them all to talk to each other has been problematic,” McDonald says.

Collecting, integrating, and sharing AMR-related data on a national and ultimately global scale will be necessary to better understand the issue. HCI and mobile tools can help doctors, hospitals, and public health authorities collect more information while advanced analytics, machine learning, and in-memory computing can enable them to analyze that data in close to real time. As a result, we’ll better understand patterns of resistance from the bedside to the community and up to national and international levels, says Solomon. The good news is that new technology capabilities like AI and new potential streams of data are coming online as an era of data sharing in healthcare is beginning to dawn, adds McDonald.

The ideal goal is a digitally enabled virtuous cycle of information and treatment that could save millions of dollars, lives, and perhaps even civilization if we can get there. D!

Read more thought provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Dr. David Delaney is Chief Medical Officer for SAP.

Joseph Miles is Global Vice President, Life Sciences, for SAP.

Walt Ellenberger is Senior Director Business Development, Healthcare Transformation and Innovation, for SAP.

Saravana Chandran is Senior Director, Advanced Analytics, for SAP.

Stephanie Overby is an independent writer and editor focused on the intersection of business and technology.

4 Traits Set Digital Leaders Apart From 97% Of The Competition

Vivek Bapat

Like the classic parable of the blind men and the elephant, it seems everyone has a unique take on digital transformation. Some equate digital transformation with emerging technologies, placing their bets on the Internet of Things, machine learning, and artificial intelligence. Others see it as a way to increase efficiencies and change business processes to accelerate products to market. Still others think of it as a means of strategic differentiation, innovating new business models for serving and engaging their customers. Despite the range of viewpoints, many businesses are still challenged with pragmatically evolving their digital capabilities in ways that are meaningful, industry-disruptive, and market-leading.

According to a recent study of more than 3,000 senior executives across 17 countries and regions, only a paltry three percent of businesses worldwide have successfully completed enterprise-wide digital transformation initiatives, even though 84% of C-level executives rank such efforts as “critically important” to the fundamental sustenance of their business.

The most comprehensive global study of its kind, the SAP Center for Business Insight report “SAP Digital Transformation Executive Study: 4 Ways Leaders Set Themselves Apart,” in collaboration with Oxford Economics, identified the challenges, opportunities, value, and key technologies driving digital transformation. The findings specifically analyzed the performance of “digital leaders” – those who are connecting people, things, and businesses more intelligently, more effectively, and creating punctuated change faster than their less advanced rivals.

After analyzing the data, it was eye-opening to see that only three percent of companies (the top 100 in the study) are successfully realizing their full potential through digital transformation. However, even more remarkable was that these leaders have four fundamental traits in common, regardless of their region of operation, their size, their organizational structure, or their industry.

We distilled these traits in the hope that others in the early stages of transformation, or those still struggling to find their bearings, can embrace these principles in order to succeed. Ultimately, I see these leaders as truly ambidextrous organizations, managing evolutionary and revolutionary change simultaneously and willing to embrace innovation – not just at the edges of their business, but firmly in its core.

Here are the four traits that set these leaders apart from the rest:

Trait #1: They see digital transformation as truly transformational

An overwhelming majority (96%) of digital leaders view digital transformation as a core business goal that requires a unified digital mindset across the entire enterprise. But instead of allowing individual functions to change at their own pace, digital leaders prefer to evolve the organization to help ensure the success of their digital strategies.

The study found that 56% of these businesses regularly shift their organizational structure, which includes processes, partners, suppliers, and customers, compared to 10% of remaining companies. Plus, 70% actively bring lines of business together through cross-functional processes and technologies.

By creating a firm foundation for transformation, digital leaders are further widening the gap between themselves and their less advanced competitors as they innovate business models that can mitigate emerging risks and seize new opportunities quickly.

Trait #2: They focus on transforming customer-facing functions first

Although most companies believe technology, the pace of change, and growing global competition are the key global trends that will affect everything for years to come, digital leaders are expanding their frame of mind to consider the influence of customer empowerment. Executives who build a momentum of breakthrough innovation and industry transformation are the ones that are moving beyond the high stakes of the market to the activation of complete, end-to-end customer experiences.

In fact, 92% of digital leaders have established sophisticated digital transformation strategies and processes to drive transformational change in customer satisfaction and engagement, compared to 22% of their less mature counterparts. As a result, 70% have realized significant or transformational value from these efforts.

Trait #3: They create a virtuous cycle of digital talent

There’s little doubt that the competition for qualified talent is fierce. But for nearly three-quarters of companies that demonstrate digital-transformation leadership, it is easier to attract and retain talent because they are five times more likely to leverage digitization to change their talent management efforts.

The impact of their efforts goes beyond empowering recruiters to identify best-fit candidates, highlight risk factors and hiring errors, and predict long-term talent needs. Nearly half (48%) of digital leaders understand that they must invest heavily in the development of digital skills and technology to drive revenue, retain productive employees, and create new roles to keep up with their digital maturity over the next two years, compared to 30% of all surveyed executives.

Trait #4: They invest in next-generation technology using a bimodal architecture

A couple years ago, Peter Sondergaard, senior vice president at Gartner and global head of research, observed that “CIOs can’t transform their old IT organization into a digital startup, but they can turn it into a bi-modal IT organization. Forty-five percent of CIOs state they currently have a fast mode of operation, and we predict that 75% of IT organizations will be bimodal in some way by 2017.”

Based on the results of the SAP Center for Business Insight study, Sondergaard’s prediction was spot on. As digital leaders dive into advanced technologies, 72% are using a digital twin of the conventional IT organization to operate efficiently without disruption while refining innovative scenarios to resolve business challenges and integrate them to stay ahead of the competition. Unfortunately, only 30% of less advanced businesses embrace this view.

Working within this bimodal architecture is emboldening digital leaders to take on incredibly progressive technology. For example, the study found that 50% of these firms are using artificial intelligence and machine learning, compared to seven percent of all respondents. They are also leading the adoption curve of Big Data solutions and analytics (94% vs. 60%) and the Internet of Things (76% vs. 52%).

Digital leadership is a practice of balance, not pure digitization

Most executives understand that digital transformation is a critical driver of revenue growth, profitability, and business expansion. However, as digital leaders are proving, digital strategies must deliver a balance of organizational flexibility, forward-looking technology adoption, and bold change. And clearly, this approach is paying dividends for them. They are growing market share, increasing customer satisfaction, improving employee engagement, and, perhaps more important, achieving more profitability than ever before.

For any company looking to catch up to digital leaders, the conversation around digital transformation needs to change immediately to combat three deadly sins: Stop investing in one-off, isolated projects hidden in a single organization. Stop viewing IT as an enabler instead of a strategic partner. Stop walling off the rest of the business from siloed digital successes.

As our study shows, companies that treat their digital transformation as an all-encompassing, all-sharing, and all-knowing business imperative will be the ones that disrupt the competitive landscape and stay ahead of a constantly evolving economy.

Follow me on Twitter: @vivek_bapat

For more insight on digital leaders, check out the SAP Center for Business Insight report, conducted in collaboration with Oxford Economics, “SAP Digital Transformation Executive Study: 4 Ways Leaders Set Themselves Apart.”

About Vivek Bapat

Vivek Bapat is the Senior Vice President, Global Head of Marketing Strategy and Thought Leadership, at SAP. He leads SAP's Global Marketing Strategy, Messaging, Positioning and related Thought Leadership initiatives.