How Open Source Is Changing Software Innovation

Al Gillen

Open-source software has come a long way. After a period of maturation – most visible to the wider public during the late 1990s and early 2000s, in the early days of Linux – the approach of developing technology under an open-use license has become mainstream.

In recent years, we have seen many commercially focused companies place both existing intellectual property and newly incubated projects into open source. This trend has been particularly strong with software used by developers as platforms or tools. Today, the vast majority of developer tools and platforms are open source or derived from open-source technology.

As enterprises across industries rush headlong into digital transformation, many have begun to emulate tech industry practices in how they use and produce software innovation and have thus increased their adoption of, and even contribution to, open source. So what are the benefits of open-source engagement for the enterprise? Here are seven broad benefits of using and contributing to open source.

  1. Not re-inventing the wheel. The most obvious reason to use open-source software is to build software faster without having to re-implement solutions to already solved problems. Companies must move fast to stay at the top of their game – and that means grabbing the best solutions contributed by a well-honed ecosystem and building their own added innovation on top of them. Doing anything else is suboptimal, leading to a more burdensome maintenance and update responsibility and, ultimately, to falling behind competitors. Contributing customizations and added value back to the larger open-source community can bring the benefit of better vetting and quality improvement of the code by the community.
  2. Ensuring strategic safety. It used to be that IT organizations bought important software only from large, established software vendors. Open-source innovation now allows smaller players to provide viable solutions while assuring buyers that they can retain control over the technical direction of the software, thus avoiding unreasonable price increases or unnecessary product changes and minimizing the potential for lock-in. This broadening of the supplier base also expands the range of software solutions available to enterprises and keeps the larger software vendors on their toes. Using software built on an open-source code-base also allows enterprises to participate in the technology’s evolution and maintain more control over the destiny of the products based on it.
  3. Efficient experimentation and business acceleration. In the age of digital transformation, experimentation is the new mantra. Digital disruptors like Netflix, Uber, and Airbnb have revolutionized their industries and put the accelerated startup culture on the map with the famous “fail fast, fail often” mantra. Modern enterprises have finally come to realize that to survive the disruption and keep up, they must adopt as much of that experimental culture as possible and overcome their fears of failure. Experimentation of this kind implies a speed in product and service development that building software entirely in-house simply cannot deliver. By taking existing code developed by others and leveraging it in new offerings, enterprises can truly speed up their product cycles. Open source provides the digital shoulders that enterprises can stand on to compete. By contributing new value back to open source, enterprises also have a chance of generating mindshare for their offerings – in effect, free marketing.
  4. The efficiency of standardized practices. Using open-source solutions means using somewhat standardized (in a de facto sense) solutions to problems. Such standardization of software patterns within certain industries and verticals encourages a normalized, more optimized set of organizational practices that tend to be portable across that industry. This can simplify business processes and allow companies to focus on competitive differentiation rather than wasting resources on things that are not core to their business success.
  5. Cleaner and safer software. Creating software in open source means that engineers operate in daylight, helping them avoid the traps of plagiarized software and steer clear of patent and copyright infringement. Additionally, the visibility open source provides can lead to more secure software and fewer vulnerability surprises, especially if a significant community evolves around the project and performs regular critical reviews. Many companies that create proprietary software have difficulty turning their large code-bases into open source because of the time-consuming intellectual-property and security scrubbing needed to open the code; businesses that start new software initiatives in open source avoid these IP issues from the get-go.
  6. Attracting, retaining, and motivating top developer talent. Beyond a good pay scale and a supportive work environment, few things push developers to do high-quality work more than peer approval and the opportunity for recognition or even fame. Contributing software back to the community and allowing developers to enjoy the public recognition of their peers can be a powerful motivator and an important tool for employee retention. A similar dynamic is at play in hiring, as tech companies compete with each other to build their software engineering teams. For star developers, the opportunity to be visible in the broader developer community and attain peer recognition is potentially more important than top wages.
  7. Community-led innovation. With a diverse group of vendors and customers participating in open-source efforts, open-source solutions tend to add functionality relevant to their audience faster than proprietary solutions typically can. As a result, open-source adopters are able to influence the prioritization of capabilities added to open-source initiatives and quickly accomplish goals specific to their environment.

For more on future-focused digital innovation, see Business Networks: The Platforms For Future Innovation.

About Al Gillen

Al Gillen is Group Vice President of Software Development and Open Source at IDC, where he oversees IDC’s software development research portfolio. Research disciplines in this group include developer research covering census, demographics, and developer activities; platform and cloud application services for developers; and developer lifecycle and quality-assurance products. In addition, Al jointly oversees IDC’s DevOps research program and runs a program focused on the open-source software ecosystem across the industry. In his 18th year at IDC, Al has participated in numerous IDC research areas, including infrastructure software (operating environments and virtualization software), enterprise servers, and developer software and services. He has long tracked open-source software in infrastructure software markets and has now expanded that coverage to other market segments.

Four Ways Small Businesses Can Boost Security

Justin Fox

Many small business owners have long felt that they were relatively safe from the threat of cybercrime, assuming that cyber-criminals targeted only large corporations and multinationals. In fact, nothing could be further from the truth: Cyber-criminals are more than happy to prey upon anyone who may have money that can be swindled away. Criminals now see small businesses as soft targets, as they tend to have insufficient staff or resources to properly focus on security and protect themselves from attack. In light of this, small businesses owe it to themselves to improve security and minimize their chances of becoming a cyber-criminal’s next target.

CCTV

Using CCTV is one of the simplest means of deterring crime, as a visible warning to thieves is often enough to make them look elsewhere. If a crime is committed, CCTV can provide crucial evidence and help recover stolen property. The London riots of 2011 serve as a great example of how effective CCTV can be: police watched more than 200,000 hours of footage, which led to the prosecution of approximately 5,000 offenders.

Whilst it isn’t something that most business owners want to contemplate, petty theft by staff is a common occurrence, and CCTV can help tackle that issue. Of course, it’s preferable never to have to use the CCTV footage to trace a crime, but having it in place does provide a level of security and reassurance that your business is protected.

Tracking software

This isn’t a low-budget option, but it does a remarkably efficient job of tracking stolen items. Whilst phones, laptops, and tablets come with this type of system built in, tracking devices can also be fitted to valuable assets such as vehicles and equipment. The device transmits a constant signal to a central hub, so if the item or vehicle is stolen, the hub can pinpoint its location, allowing the police to trace it and apprehend the thieves.

Indeed, this technology has proven so effective that police in California have begun to use it to catch thieves swiping packages from doorsteps. Lured by decoy parcels with concealed GPS trackers, numerous criminals were successfully caught.

Cybersecurity

Properly protecting your business’s IT systems and Internet connection is critical, no matter what size the business is. Criminals are constantly looking for ways to steal money, personal data, business records, and even technical documents. It’s estimated that cybercrime costs the UK economy £73 billion per year, and no business can afford to be complacent about security, since we are all potential targets. Ransomware, in which a hacker holds data hostage and releases it only after receiving a payoff, is fast becoming the most common threat businesses face.

Unfortunately, many cyberattacks originate overseas, which makes it incredibly hard for police to successfully track or prosecute those responsible. It also makes it very unlikely that victims will ever recover their stolen money.

Education

The simplest and most effective way to protect your business is by educating staff about security. This education should cover physical security measures, such as being vigilant around the premises, as well as online security training. All employees should be trained to stay safe online and should be able to spot threats and suspected fraudulent activity. In an ideal world, all businesses would have a dedicated team to monitor and protect IT systems, but of course, that isn’t always possible for every business.

While younger generations may be more comfortable online and know the warning signs, those who did not grow up during the digital age may be less aware of potential problems. Various governments have taken measures to ensure that the students of today have at least a working knowledge of cybersecurity, but those outside the education system will have to find alternate sources of education.

It’s vital that businesses take security seriously and invest as much as possible in protecting themselves. Of course, preventative measures are better than reactive ones, but comprehensive plans do need to be in place to deal with security breaches if they actually occur. Even the simplest steps, such as robust password policies, can make a big difference and should not be ignored. Treating digital security as seriously as physical security will go a long way to protecting any business.

For more insight on cybersecurity, see The Future Of Cybersecurity: Trust As Competitive Advantage.

About Justin Fox

Justin is a history graduate from the University of Kent, with a keen interest in current affairs and how the events of today are poised to affect the world of tomorrow. As technology continues to permeate virtually every aspect of our lives, Justin seeks to better understand the latest innovations, in the hopes of better understanding us as a society as a result.

CFOs And CIOs Making Tech Decisions Together

Daniel Newman

We all know the feeling of dread that comes when waiting for a major project to be approved by the finance department. Some of us might even have a bit of inner dialogue: “Why does the CFO get to choose what’s right for my team? Finance doesn’t know anything about it!” Submitting the request can sometimes feel like throwing a Hail Mary – closing our eyes and hoping for the best. Hopefully the CFO is in a good mood – or not paying too much attention – and lets the project slide through.

Fortunately, that type of relationship with the finance team (and the CFO in particular) is falling by the wayside in the digital age. We’ve talked a lot about the importance of breaking down silos in today’s work environment. It’s no longer feasible to be an agile, forward-thinking company without collaborative decision-making. That doesn’t just apply to teams like marketing, customer service, and IT, however. It goes across the board, from your human resources teams to your CFO.

Digital transformation is forcing today’s CFOs to get up close and personal with technology. CFOs are beginning to ask questions about the technology architecture to get a better understanding of the programs and systems in place. They’re learning more about how technology works, how it can save money and boost sales, and how it can improve overall efficiencies – all of which fall under a CFO’s purview. That’s why it is so critical that CIOs and CFOs begin to partner on the road to digital transformation, whether that means modernizing legacy systems or developing new technology to improve customer engagement. The following are a few ways CIOs can help CFOs better understand the importance of technology as we move ahead in the digital marketplace.

1. Speak his or her language

Your CFO may not have a technology background, and that’s OK. Use your communication skills to strip away the technical names and descriptions, and instead show your CFO the power of what your tech can do. Include details like how much time and money can be saved, how many new customers can be gained, and how many new sales can be added using the software or apps you are recommending. If your CFO is interested, offer a demonstration to help him or her understand how the software works and how it will benefit your team.

2. Focus on benefits, not buzzwords

Things like analytics, Big Data, and machine learning have a real, tangible benefit for your company. Rather than throwing around buzzwords, focus on the true value they offer. Analytics can help you improve targeting for the marketing team; machine learning can help you personalize customer interactions; Big Data can help your company make more-informed investment decisions. These are the kinds of benefits your CFO will care about.

3. Focus on long-term growth

A constant challenge for CFOs is to balance spending and investing without hampering productivity and competitiveness. Know that your CFO will always be weighing the cost and risk against the benefit you are offering. Explain how the project will benefit the company in the long term. Don’t hide the costs; just explain how the investment will pay dividends.

4. Share the industry’s story

It can be easy to miss the forest for the trees. Help your CFO understand what’s happening in your industry, not just in your company. Give them a rundown of what your competitors are doing in the technology area and how your company can be (or has been) impacted by falling behind. Show them how your project will not just help your company grow, but also help it get ahead of your competitors.

5. Stay connected

Technology is an ever-changing beast. Share with your CFO news and updates on new technology impacting your industry, and schedule regular meetings to discuss new opportunities to save money, increase sales, and boost productivity. This kind of collaboration will help your entire company work more efficiently, eliminating delays and last-minute budget surprises.

There is no room for silos in the digital economy, even in the world of finance. Although you may never have considered your CFO a member of the tech team, it’s time to start. Doing so will help your company move even further ahead and create an important ally in the digital transformation.

About Daniel Newman

Daniel Newman serves as the co-founder and CEO of EC3, a quickly growing hosted IT and communication service provider. Prior to this role, Daniel held several prominent leadership positions, including serving as CEO of United Visual, parent company to United Visual Systems, United Visual Productions, and United GlobalComm, a family of companies focused on visual communications and audio-visual technologies. Daniel is also widely published and active in the social media community. He is the author of the Amazon best-selling business book “The Millennial CEO,” co-founded the global online community 12 Most, and was recognized by the Huffington Post as one of the 100 Business and Leadership Accounts to Follow on Twitter. Newman is an adjunct professor of management at North Central College. He earned his undergraduate degree in marketing at Northern Illinois University and an executive MBA from North Central College in Naperville, IL. Newman resides in Aurora, Illinois, with his wife, Lisa, and his two daughters, Hailey (9) and Avery (5). A Chicago native, Newman is an avid golfer, a fitness fan, and a classically trained pianist.

Heroes in the Race to Save Antibiotics

Dr. David Delaney, Joseph Miles, Walt Ellenberger, Saravana Chandran, and Stephanie Overby

Last August, a woman arrived at a Reno, Nevada, hospital and told the attending doctors that she had recently returned from an extended trip to India, where she had broken her right thighbone two years earlier. The woman, who was in her 70s, had subsequently developed an infection in her thigh and hip for which she was hospitalized in India several times. The Reno doctors recognized that the infection was serious—and the visit to India, where antibiotic-resistant bacteria run rampant, raised red flags.

When none of the 14 antibiotics the physicians used to treat the woman worked, they sent a sample of the bacterium to the U.S. Centers for Disease Control (CDC) for testing. The CDC confirmed the doctors’ worst fears: the woman had a class of microbe called carbapenem-resistant Enterobacteriaceae (CRE). Carbapenems are a powerful class of antibiotics used as last-resort treatment for multidrug-resistant infections. The CDC further found that, in this patient’s case, the pathogen was impervious to all 26 antibiotics approved by the U.S. Food and Drug Administration (FDA).

In other words, there was no cure.

This is just the latest alarming development signaling the end of the road for antibiotics as we know them. In September, the woman died from septic shock, in which an infection takes over and shuts down the body’s systems, according to the CDC’s Morbidity and Mortality Weekly Report.

Other antibiotic options, had they been available, might have saved the Nevada woman. But the solution to the larger problem won’t be a new drug. It will have to be an entirely new approach to the diagnosis of infectious disease, to the use of antibiotics, and to the monitoring of antimicrobial resistance (AMR)—all enabled by new technology.

But that new technology is not being implemented fast enough to prevent what former CDC director Tom Frieden has nicknamed “nightmare bacteria.” And the nightmare is becoming scarier by the year. A 2014 British study calculated that 700,000 people die globally each year because of AMR. By 2050, according to a 2014 estimate, antibiotic resistance could cause 10 million deaths a year and cost the world a cumulative US$100 trillion. And the rate of AMR is growing exponentially, thanks to the speed with which humans serving as hosts for these nasty bugs can move among healthcare facilities—or countries. In the United States, for example, CRE had been seen only in North Carolina in 2000; today it’s nationwide.

Abuse and overuse of antibiotics in healthcare and livestock production have enabled bacteria to both mutate and acquire resistant genes from other organisms, resulting in truly pan-drug resistant organisms. As ever-more powerful superbugs continue to proliferate, we are potentially facing the deadliest and most costly human-made catastrophe in modern times.

“Without urgent, coordinated action by many stakeholders, the world is headed for a post-antibiotic era, in which common infections and minor injuries which have been treatable for decades can once again kill,” said Dr. Keiji Fukuda, assistant director-general for health security for the World Health Organization (WHO).

Even if new antibiotics could solve the problem, there are obstacles to their development. For one thing, antibiotics have complex molecular structures, which slows the discovery process. Further, they aren’t terribly lucrative for pharmaceutical manufacturers: public health concerns call for new antimicrobials to be financially accessible to patients and used conservatively precisely because of the AMR issue, which reduces the financial incentives to create new compounds. The last entirely new class of antibiotics was introduced 30 years ago. Finally, bacteria will develop resistance to new antibiotics as well if we don’t adopt new approaches to using them.

Technology can play the lead role in heading off this disaster. Vast amounts of data from multiple sources are required for better decision making at all points in the process, from tracking or predicting antibiotic-resistant disease outbreaks to speeding the potential discovery of new antibiotic compounds. However, microbes will quickly adapt and resist new medications, too, if we don’t also employ systems that help doctors diagnose and treat infection in a more targeted and judicious way.

Indeed, digital tools can help in all four actions that the CDC recommends for combating AMR: preventing infections and their spread, tracking resistance patterns, improving antibiotic use, and developing new diagnostics and treatment.

Meanwhile, individuals who understand both the complexities of AMR and the value of technologies like machine learning, human-computer interaction (HCI), and mobile applications are working to develop and advocate for solutions that could save millions of lives.

Keeping an Eye Out for Outbreaks

Like others who are leading the fight against AMR, Dr. Steven Solomon has no illusions about the difficulty of the challenge. “It is the single most complex problem in all of medicine and public health—far outpacing the complexity and the difficulty of any other problem that we face,” says Solomon, who is a global health consultant and former director of the CDC’s Office of Antimicrobial Resistance.

Solomon wants to take the battle against AMR beyond the laboratory. In his view, surveillance—tracking and analyzing various data on AMR—is critical, particularly given how quickly and widely it spreads. But surveillance efforts are currently fraught with shortcomings. The available data is fragmented and often not comparable. Hospitals fail to collect the representative samples necessary for surveillance analytics, collecting data only on those patients who experience resistance and not on those who get better. Laboratories use a wide variety of testing methods, and reporting is not always consistent or complete.

Surveillance can serve as an early warning system. But weaknesses in these systems have caused public health officials to consistently underestimate the impact of AMR in loss of lives and financial costs. That’s why improving surveillance must be a top priority, says Solomon, who previously served as chair of the U.S. Federal Interagency Task Force on AMR and has been tracking the advance of AMR since he joined the U.S. Public Health Service in 1981.

A Collaborative Diagnosis

Ineffective surveillance has also contributed to huge growth in the use of antibiotics when they aren’t warranted. Strong patient demand and financial incentives for prescribing physicians are blamed for antibiotics abuse in China. India has become the largest consumer of antibiotics on the planet, in part because they are prescribed or sold for diarrheal diseases and upper respiratory infections for which they have limited value. And many countries allow individuals to purchase antibiotics over the counter, exacerbating misuse and overuse.

In the United States, antibiotics are improperly prescribed 50% of the time, according to CDC estimates. One study of adult patients visiting U.S. doctors to treat respiratory problems found that more than two-thirds of antibiotics were prescribed for conditions that were not infections at all or for infections caused by viruses—for which an antibiotic would do nothing. That’s 27 million courses of antibiotics wasted a year—just for respiratory problems—in the United States alone.

And even in countries with national guidelines for prescribing antibiotics, those guidelines aren’t always followed. A study published in the medical journal Family Practice showed that Swedish doctors, both those trained in Sweden and those trained abroad, inconsistently followed the rules for prescribing antibiotics.

Solomon strongly believes that, worldwide, doctors need to expand their use of technology in their offices or at the bedside to guide them through a more rational approach to antibiotic use. Doctors have traditionally been reluctant to adopt digital technologies, but Solomon thinks that the AMR crisis could change that. New digital tools could help doctors and hospitals integrate guidelines for optimal antibiotic prescribing into their everyday treatment routines.

“Human-computer interactions are critical, as the amount of information available on antibiotic resistance far exceeds the ability of humans to process it,” says Solomon. “It offers the possibility of greatly enhancing the utility of computer-assisted physician order entry (CPOE), combined with clinical decision support.” Healthcare facilities could embed relevant information and protocols at the point of care, guiding the physician through diagnosis and prescription and, as a byproduct, facilitating the collection and reporting of antibiotic use.
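
To make this concrete, below is a minimal, hypothetical sketch of such a point-of-care check: a CPOE-style routine screens a proposed antibiotic order against an embedded guideline table and, as a byproduct, logs every order for antibiotic-use reporting. The condition names, drug lists, and function names here are illustrative assumptions, not drawn from any real clinical system.

```python
# Hypothetical sketch of a point-of-care antibiotic order check in a
# CPOE-style system. Guideline data and names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime

# Toy guideline table: condition -> antibiotics considered first-line (assumed).
FIRST_LINE = {
    "community-acquired pneumonia": {"amoxicillin", "doxycycline"},
    "uncomplicated UTI": {"nitrofurantoin", "trimethoprim-sulfamethoxazole"},
}

@dataclass
class OrderLog:
    """Collects every order as a byproduct, for antibiotic-use reporting."""
    entries: list = field(default_factory=list)

    def record(self, condition: str, drug: str, flagged: bool) -> None:
        self.entries.append((datetime.utcnow(), condition, drug, flagged))

def check_order(condition: str, drug: str, log: OrderLog) -> str:
    """Return guidance for a proposed antibiotic order and log the order."""
    recommended = FIRST_LINE.get(condition)
    if recommended is None:
        log.record(condition, drug, flagged=False)
        return f"No embedded guideline for '{condition}'; order recorded."
    if drug in recommended:
        log.record(condition, drug, flagged=False)
        return f"{drug} is first-line for {condition}."
    log.record(condition, drug, flagged=True)
    return (f"{drug} is not first-line for {condition}; "
            f"guidelines suggest: {', '.join(sorted(recommended))}.")

log = OrderLog()
print(check_order("uncomplicated UTI", "ciprofloxacin", log))
print(f"{len(log.entries)} order(s) captured for surveillance reporting.")
```

Even a simple rule table like this illustrates the dual role Solomon describes: immediate guidance for the prescriber, plus a stream of structured usage data for surveillance.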

Cincinnati Children’s Hospital’s antibiotic stewardship division has deployed a software program that gathers information from electronic medical records, order entries, computerized laboratory and pathology reports, and more. The system measures baseline antimicrobial use, dosing, duration, costs, and use patterns. It also analyzes bacteria and trends in their susceptibilities and helps with clinical decision making and prescription choices. The goal, says Dr. David Haslam, who heads the program, is to decrease the use of “big gun” super antibiotics in favor of more targeted treatment.

While this approach is not yet widespread, there is consensus that incorporating such clinical-decision support into electronic health records will help improve quality of care, contain costs, and reduce overtreatment in healthcare overall—not just in AMR. A 2013 randomized clinical trial found that doctors who used decision-support tools were significantly less likely to order antibiotics than those in the control group and prescribed 50% fewer broad-spectrum antibiotics.

Putting mobile devices into doctors’ hands could also help them accept decision support, believes Solomon. Last summer, Scotland’s National Health Service developed an antimicrobial companion app to give practitioners nationwide mobile access to clinical guidance, as well as an audit tool to support boards in gathering data for local and national use.

“The immediacy and the consistency of the input to physicians at the time of ordering antibiotics may significantly help address the problem of overprescribing in ways that less-immediate interventions have failed to do,” Solomon says. In addition, handheld devices with so-called lab-on-a-chip technology could be used to test clinical specimens at the bedside and transmit the data across cellular or satellite networks in areas where infrastructure is more limited.

Artificial intelligence (AI) and machine learning can also become invaluable technology collaborators to help doctors more precisely diagnose and treat infection. In such a system, “the physician and the AI program are really ‘co-prescribing,’” says Solomon. “The AI can handle so much more information than the physician and make recommendations that can incorporate more input on the type of infection, the patient’s physiologic status and history, and resistance patterns of recent isolates in that ward, in that hospital, and in the community.”

Speed Is Everything

Growing bacteria in a dish has never appealed to Dr. James Davis, a computational biologist with joint appointments at Argonne National Laboratory and the University of Chicago Computation Institute. The first of a growing breed of computational biologists, Davis chose a PhD advisor in 2004 who was steeped in bioinformatics technology “because you could see that things were starting to change,” he says. He was one of the first in his microbiology department to submit a completely “dry” dissertation—that is, one that was all digital with nothing grown in a lab.

Upon graduation, Davis wanted to see if it was possible to predict whether an organism would be susceptible or resistant to a given antibiotic, leading him to explore the potential of machine learning to predict AMR.

As the availability of cheap computing power has gone up and the cost of genome sequencing has gone down, it has become possible to sequence a pathogen sample in order to detect its resistance mechanisms. This could allow doctors to identify the nature of an infection in minutes instead of hours or days, says Davis.

Davis is part of a team creating a giant database of bacterial genomes with AMR metadata for the Pathosystems Resource Integration Center (PATRIC), funded by the U.S. National Institute of Allergy and Infectious Diseases to collect data on priority pathogens, such as tuberculosis and gonorrhea.

Because the current inability to identify microbes quickly is one of the biggest roadblocks to making an accurate diagnosis, the team’s work is critically important. The standard method for identifying drug resistance is to take a sample from a wound, blood, or urine and expose the resident bacteria to various antibiotics. If the bacterial colony continues to divide and thrive despite the presence of a normally effective drug, it indicates resistance. The process typically takes between 16 and 20 hours, itself an inordinate amount of time in matters of life and death. For certain strains of antibiotic-resistant tuberculosis, though, such testing can take a week. While physicians are waiting for test results, they often prescribe broad-spectrum antibiotics or make a best guess about what drug will work based on their knowledge of what’s happening in their hospital, “and in the meantime, you either get better,” says Davis, “or you don’t.”

At PATRIC, researchers are using machine-learning classifiers to identify regions of the genome involved in antibiotic resistance that could form the foundation for a “laboratory free” process for predicting resistance. Being able to identify the genetic mechanisms of AMR and predict the behavior of bacterial pathogens without petri dishes could inform clinical decision making and improve reaction time. Thus far, the researchers have developed machine-learning classifiers for identifying antibiotic resistance in Acinetobacter baumannii (a big player in hospital-acquired infection), methicillin-resistant Staphylococcus aureus (a.k.a. MRSA, a worldwide problem), and Streptococcus pneumoniae (a leading cause of bacterial meningitis), with accuracies ranging from 88% to 99%.
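
For intuition, here is a minimal, hypothetical sketch of the general technique: encode each genome as k-mer counts and train a supervised classifier to separate resistant from susceptible strains. Everything below (the k-mer featurization, the planted resistance motif, the scikit-learn model) is an illustrative assumption run on synthetic data rather than PATRIC’s actual pipeline; in a real system, labels would come from laboratory susceptibility testing, and the learned features could point back to genomic regions involved in resistance.

```python
# Toy genomic AMR classifier: k-mer count features + random forest.
# Synthetic data only; illustrative of the approach, not PATRIC's pipeline.
import random
from collections import Counter

from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction import DictVectorizer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

random.seed(0)
K = 6  # k-mer length (an assumption; real pipelines tune this)

def kmer_counts(sequence, k=K):
    """Count overlapping k-mers in a DNA sequence."""
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

def synthetic_genome(resistant, length=2000):
    """Generate a toy genome; 'resistant' strains carry a planted motif
    standing in for a resistance gene that a classifier could learn."""
    seq = "".join(random.choice("ACGT") for _ in range(length))
    if resistant:
        motif = "ACGTTGGCCTTAAGGCCAACGTACGTAGG"  # illustrative only
        pos = random.randrange(length - len(motif))
        seq = seq[:pos] + motif + seq[pos + len(motif):]
    return seq

# Build a labeled toy dataset: 1 = resistant, 0 = susceptible.
labels = [i % 2 for i in range(200)]
genomes = [synthetic_genome(bool(y)) for y in labels]

# Vectorize k-mer counts into a sparse feature matrix.
X = DictVectorizer(sparse=True).fit_transform(
    [kmer_counts(g) for g in genomes])

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

The appeal of this kind of “laboratory free” prediction is speed: once a genome is sequenced, classification takes seconds, versus the 16 to 20 hours (or more) of culture-based susceptibility testing described above.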

Houston Methodist Hospital, which uses the PATRIC database, is researching multidrug-resistant bacteria, specifically MRSA. Not only does resistance increase the cost of care, but people with MRSA are 64% more likely to die than people with a nonresistant form of the infection, according to WHO. Houston Methodist is investigating the molecular genetic causes of drug resistance in MRSA in order to identify new treatment approaches and help develop novel antimicrobial agents.

The Hunt for a New Class of Antibiotics

There are antibiotic-resistant bacteria, and then there’s Clostridium difficile—a.k.a. C. difficile—a bacterium that, following antibiotic use, attacks the intestines of hospital patients, even young and healthy ones.

It is because of C. difficile that Dr. L. Clifford McDonald jumped into the AMR fight. The epidemiologist was finishing his work analyzing the spread of SARS in Toronto hospitals in 2004 when he turned his attention to C. difficile, convinced that the bacteria would become more common and more deadly. He was right, and today he’s at the forefront of treating the infection and preventing the spread of AMR as senior advisor for science and integrity in the CDC’s Division of Healthcare Quality Promotion. “[AMR] is an area that we’re funding heavily…insofar as the CDC budget can fund anything heavily,” says McDonald, whose group has awarded $14 million in contracts for innovative anti-AMR approaches.

Developing new antibiotics is a major part of the AMR battle. The majority of new antibiotics developed in recent years have been variations of existing drug classes. It’s been three decades since the last new class of antibiotics was introduced. Less than 5% of venture capital in pharmaceutical R&D is focused on antimicrobial development. A 2008 study found that less than 10% of the 167 antibiotics in development at the time had a new “mechanism of action” to deal with multidrug resistance. “The low-hanging fruit [of antibiotic development] has been picked,” noted a WHO report.

Researchers will have to dig much deeper to develop novel medicines. Machine learning could help drug developers sort through much larger data sets and go about the capital-intensive drug development process in a more prescriptive fashion, synthesizing those molecules most likely to have an impact.

McDonald believes that it will become easier to find new antibiotics if we gain a better understanding of the communities of bacteria living in each of us—as many as 1,000 different types of microbes live in our intestines, for example. Disruption to those microbial communities—our “microbiome”—can herald AMR. McDonald says that Big Data and machine learning will be needed to unlock our microbiomes, and that’s where much of the medical community’s investment is going.

He predicts that within five years, hospitals will take fecal samples or skin swabs and sequence the microorganisms in them as a kind of pulse check on antibiotic resistance. “Just doing the bioinformatics to sort out what’s there and the types of antibiotic resistance that might be in that microbiome is a Big Data challenge,” McDonald says. “The only way to make sense of it, going forward, will be advanced analytic techniques, which will no doubt include machine learning.”

Reducing Resistance on the Farm

Bringing information closer to where it’s needed could also help reduce agriculture’s contribution to the antibiotic resistance problem. Antibiotics are widely given to livestock to promote growth or prevent disease. In the United States, more kilograms of antibiotics are administered to animals than to people, according to data from the FDA.

One company has developed a rapid, on-farm diagnostics tool to provide livestock producers with more accurate disease detection to make more informed management and treatment decisions, which it says has demonstrated a 47% to 59% reduction in antibiotic usage. Such systems, combined with pressure or regulations to reduce antibiotic use in meat production, could also help turn the AMR tide.

Breaking Down Data Silos Is the First Step

Adding to the complexity of the fight against AMR is the structure and culture of the global healthcare system itself. Historically, healthcare has been a siloed industry, notorious for its scattered approach focused on transactions rather than healthy outcomes or the true value of treatment. There’s no definitive data on the impact of AMR worldwide; the best we can do is infer estimates from the information that does exist.

The biggest issue is the availability of good data to share through mobile solutions, to drive HCI clinical-decision support tools, and to feed supercomputers and machine-learning platforms. “We have a fragmented healthcare delivery system and therefore we have fragmented information. Getting these sources of data all into one place and then enabling them all to talk to each other has been problematic,” McDonald says.

Collecting, integrating, and sharing AMR-related data on a national and ultimately global scale will be necessary to better understand the issue. HCI and mobile tools can help doctors, hospitals, and public health authorities collect more information while advanced analytics, machine learning, and in-memory computing can enable them to analyze that data in close to real time. As a result, we’ll better understand patterns of resistance from the bedside to the community and up to national and international levels, says Solomon. The good news is that new technology capabilities like AI and new potential streams of data are coming online as an era of data sharing in healthcare is beginning to dawn, adds McDonald.

The ideal goal is a digitally enabled virtuous cycle of information and treatment that could save millions of dollars, lives, and perhaps even civilization if we can get there.

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About the Authors:

Dr. David Delaney is Chief Medical Officer for SAP.

Joseph Miles is Global Vice President, Life Sciences, for SAP.

Walt Ellenberger is Senior Director Business Development, Healthcare Transformation and Innovation, for SAP.

Saravana Chandran is Senior Director, Advanced Analytics, for SAP.

Stephanie Overby is an independent writer and editor focused on the intersection of business and technology.

Small And Midsize Businesses Have The Capacity To Drive Europe’s Future As A Digital Superpower

Katja Mehl

Part 10 of the “Road to Digital Transformation” series

Representing 99.8% of all companies throughout Europe, small and midsize businesses have tremendous power when it comes to impacting the region’s economy. One innovation at a time, they’re transforming entire industries, propelling emerging ones forward with adjacent offerings, and even supersizing a favorite childhood toy to make living conditions better for the poor and homeless. But perhaps the greatest evolution is found in these firms’ growing adoption of technology.

According to the IDC InfoBrief “The Next Steps in Digital Transformation: How Small and Midsize Companies Are Applying Technology to Meet Key Business Goals with Insights for Europe,” sponsored by SAP, 35.4% of all European firms feel that their adoption of digital technology is either advanced or well underway. Germany and France are great examples of countries that are embracing advanced business networks and automation technology – such as the Internet of Things – to boost productivity and computerize or consolidate roles left empty due to long-term labor shortages.

Despite the progress made in some countries, I am also aware of others that are still resistant to digitizing their economy and automating operations. What’s the difference between firms that are digital leaders and those that are slow to mature? From my perspective in working with a variety of businesses throughout Europe, it’s a combination of diversity and technology availability.

[Figure: Digital transformation self-assessment. Source: “The Next Steps in Digital Transformation: How Small and Midsize Companies Are Applying Technology to Meet Key Business Goals with Insights for Europe,” IDC InfoBrief, sponsored by SAP, 2017.]

Opportunities abound with digital transformation

European companies are hardly homogenous. Spread across the continent’s 47 countries, they serve communities that speak any of some 225 languages, and each market differs in digital development, economic stability, and workforce needs.

Nevertheless, as a whole, European firms do prioritize customer acquisition as well as improving efficiency and reducing costs. Over one-third of small and midsize companies are investing in collaboration software, customer relationship management solutions, e-commerce platforms, analytics, and talent management applications. Steadily, business leaders are finding better ways to go beyond data collection, applying predictive analytics and machine learning to gain real-time insight and automate processes where possible.

Small and midsize businesses have a distinct advantage in this area over their larger rivals because they can, by nature, adopt new technology and practices quickly and act on decisions with greater agility. Nearly two-thirds (64%) of European firms are embracing the early stages of digitalization and planning to mature over time. Yet the level of adoption depends largely on the leadership team’s commitment.

For many small and midsize companies across this region, the path to digital maturity resides in the cloud, more so than on-premise software deployment. For example, the flexibility associated with cloud deployment is viewed as a top attribute, especially among U.K. firms. This brings us back to the diversity of our region. Some countries prioritize personal data security while others may be more concerned with the ability to access the information they need in even the most remote of areas.

Technology alone does not deliver digital transformation

Digital transformation is certainly worth the effort for European firms. Between 60% and 90% of small and midsize European businesses say their technology investments have met or exceeded their expectations – indicative of the steady, powerful transitions enabled by cloud computing, which now gives these companies the same access to the latest technology, data storage, and IT resources as their larger competitors.

However, it is also important to note that a cloud platform is only as effective as the long-term digital strategy it enables. To invigorate transformative change, leadership needs to go beyond technology and adopt a mindset that embraces new ideas, continuously tests the fitness of business models and processes, and allows the flexibility to evolve the company as quickly as market dynamics change. By taking a step back and integrating digital objectives throughout the business strategy, leadership can pull together the elements needed to turn technology investments into differentiating, sustainable change – for example, hiring the best talent with the right skills and onboarding partners and suppliers with a complementary or shared digital vision and capability.

The IDC InfoBrief confirms what I have known all along: Small and midsize businesses are beginning to digitally mature and to maintain a strategy that is relevant to their end-to-end processes. Furthering that digital transformation goes hand in hand with these firms’ ability to ignite a transformational force that will likely advance Europe’s culture, social structure, and economy.

To learn how small and midsize businesses across Europe are digitally transforming themselves to advance their future success, check out the IDC InfoBrief “The Next Steps in Digital Transformation: How Small and Midsize Companies Are Applying Technology to Meet Key Business Goals with Insights for Europe,” sponsored by SAP. For more region-specific perspectives on digital transformation, check back every Tuesday for new installments in our blog series “The Road to Digital Transformation.”

About Katja Mehl

Katja Mehl is Head of Marketing for Europe, Middle East, and Africa at SAP.