If the stats are to be believed, a veritable tsunami of data is headed our way, and those unable to adopt the tools and technology to ride the wave will be left floundering. According to some, we are entering the brontobyte era of sensor data: a single self-driving car will generate approximately 1 gigabyte of sensor data per second; airplanes are already creating 2.5 billion terabytes of data collectively; CCTV in the London Underground generates around 2 petabytes of data every day; and the Square Kilometre Array (SKA) radio telescopes, nestled on the flat desert plains just 315km northeast of Geraldton, Western Australia, will pull down more data in a day than the Internet has produced to date when they come online in 2020. Crunching through all that data will require an “exaflop-capable” supercomputer.
Source: HP ‘The Brontobyte Era in the world of interconnected digital things’
Aside from confirming a growing suspicion that data scientists and mathematicians harbor a wicked sense of humor when it comes to inventing new terminology for data sizing, what does this all actually mean? And why should we even care? It is a sad realization for someone like me, working in the IT industry, to accept and come to terms with, but let’s be honest: data is boring for most people. Just by using the term “data” we’re creating a barrier between ourselves and the majority of people with whom we’d like to be communicating, because most people will never realize or even care that they are producing or consuming data.
But this isn’t necessarily a bad thing, and it isn’t the reason why people won’t be able to capitalize on a Big Data future. In truth, this is exactly how it should be with technology. The most successful technological advancements are those that become invisible and ubiquitous to our daily lives to the point that you forget they are there.
It’s a lot like electricity. Most people don’t know how electricity works or how many kilowatts their house is consuming by the hour, but when you walk into your house and flick the switch, the lights come on, the computer hums to life, and the TV wakes up as if by magic. Once the infrastructure has been built, all that end users need to do is plug in and switch on to extract value from it. Data is the same. People don’t care about how much data is being produced; they only care about the valuable end-user product, service, or experience that data provides them.
Without the proper context or supporting infrastructure, data on its own is meaningless and the value is trapped. It’s only when data is put into context that it becomes useful information that can be tapped into and extracted by others. Much like a stray lightning bolt cannot easily be captured, despite having the power to supply a whole household with its energy needs for a month, a brontobyte of data being made available is meaningless if it is not inherently simple for the majority of potential users to access it, search it, put it in context, and use it productively.
In many instances, we are still at the stage today of concerning ourselves with how much data is being produced, rather than how its maximum value can be extracted. This is typified by our current approach to data sharing, in which we pursue traditional forms and publish on individual portals for specific and targeted purposes rather than looking at data as a vast, interconnected web that can and must be tapped into. There are portals for weather data, healthcare data, census data, and so on. Yet data portals provide only limited value to their users because they do not connect with one another as seamlessly as they could. Often each of these portals is isolated – within the organization that houses it, but also from the rest of the data universe, including news, social media, blogs, and other relevant sources.
There are many ways in which we could strengthen our understanding and treatment of data in order to magnify the value contained within it. In the first instance we need to invent better ways to search the web of data. There is a network effect where data becomes more useful and its value is magnified as more people and machines use it, add to it, and maintain it. Searching the vast web of data requires being able to find it, access it, and understand its context. In the future, searching vast portals of data will need to become as simple as it is today to search for news or documents on the Internet.
Imagine if you could join the dots between all data sources and discover hidden connections. This would put data into context in ways that were never possible before. An organization could move from questioning, for example, “Why did health outcomes in a particular rural area decrease last month?” to identifying, “Did the reduction in health outcomes in a particular rural area last year have anything to do with the storms and power outages?” In this future, the value of data would be maximized as information flowed more easily through an economy, allowing users to derive new insights, find the data they need to design new products and services, and incrementally add more information to the growing knowledge base.
This is the cusp of where we find ourselves today and the challenge that those who wish to capitalize on the vast oceans of data face. As the volume of data grows exponentially, data custodians will need to start thinking about their data as an infrastructure and as an asset, contributing to public value creation in a similar manner as transportation, water, and electrical infrastructure. Doing so can provide a network of contextualised information that people, government, and business can consume to address the problems that matter and create public value.
To find out more about the SAP Institute for Digital Government visit www.sap.com/sidg, follow us on Twitter @sapsidg and email us at firstname.lastname@example.org.
The Digitalist Magazine is your online destination for everything you need to know to lead your enterprise’s digital transformation.
Read the Digitalist Magazine and get the latest insights about the digital economy that you can capitalize on today.
About Natalie Kenny
Natalie Kenny is an Industry Value Engineer at SAP, focused on digital transformation of public sector organisations across Australia and New Zealand. She is dedicated to helping governments re-imagine future ways to interact with and deliver services to citizens.
Natalie's key focus area is exploring how digital technologies can untap the immense value that lies in government data, extracting new insights and helping solve challenging civic problems.
The business world is now firmly in the age of data. Not that data wasn’t relevant before; it was just nowhere close to the speed and volume available to us today. Businesses are buckling under the deluge of petabytes, exabytes, and zettabytes. Within those bytes lies valuable information on customer behavior, key business insights, and revenue generation. Yet all that data is practically useless if a business cannot identify the right data. And if it lacks the talent and resources to capture the right data, organize it, dissect it, draw actionable insights from it and, finally, deliver those insights in a meaningful way, its data initiatives will fail.
Rise of the CDO
Companies of all sizes can easily find themselves drowning in data generated from websites, landing pages, social streams, emails, text messages, and many other sources. Additionally, there is data in their own repositories. With so much data at their disposal, companies are under mounting pressure to utilize it to generate insights. These insights are critical because they can (and should) drive the overall business strategy and help companies make better business decisions. To leverage the power of data analytics, businesses need more “top-management muscle” specialized in the field of data science. This specialized field has led to the creation of roles like Chief Data Officer (CDO).
In addition, with more companies undertaking digital transformations, there’s greater impetus for the C-suite to make data-driven decisions. The CDO helps make data-driven decisions and also develops a digital business strategy around those decisions. As data grows at an unstoppable rate, becoming an inseparable part of key business functions, we will see the CDO act as a bridge between other C-suite execs.
Data skills – an emerging business necessity
So far, mostly large enterprises with greater data mining and management needs maintain in-house solutions: teams and technologies that handle their growing sets of diverse and dispersed data. Others work with third-party service providers to develop and execute their big data strategies.
As the amount of data grows, the need to mine it for insights becomes a key business requirement. For both large and small businesses, data-centric roles such as data analysts and data scientists will experience continued upward mobility. There is going to be a huge opportunity for critical thinkers to turn their analytical skills into rapidly growing roles in the field of data science. In fact, data skills are now a prized qualification for titles like IT project manager and computer systems analyst.
Forbes cited the McKinsey Global Institute’s prediction that by 2018 there could be a massive shortage of data-skilled professionals. This indicates a disruption at the demand-supply level with the needs for data skills at an all-time high. With an increasing number of companies adopting big data strategies, salaries for data jobs are going through the roof. This is turning the position into a highly coveted one.
According to Harvard Professor Gary King, “There is a big data revolution. The big data revolution is that now we can do something with the data.” The big problem is that most enterprises don’t know what to do with data. Data professionals are helping businesses figure that out. So if you’re casting about for where to apply your skills and want to take advantage of one of the best career paths in the job market today, focus on data science.
About Daniel Newman
Daniel Newman serves as the Co-Founder and CEO of EC3, a quickly growing hosted IT and communication service provider. Prior to this role, Daniel held several prominent leadership positions, including serving as CEO of United Visual, parent company to United Visual Systems, United Visual Productions, and United GlobalComm – a family of companies focused on visual communications and audiovisual technologies.
Daniel is also widely published and active in the social media community. He is the author of the Amazon best-selling business book "The Millennial CEO." Daniel also co-founded the global online community 12 Most and was recognized by the Huffington Post as one of the 100 Business and Leadership Accounts to Follow on Twitter.
Newman is an Adjunct Professor of Management at North Central College. He attained his undergraduate degree in Marketing at Northern Illinois University and an Executive MBA from North Central College in Naperville, IL. Newman currently resides in Aurora, Illinois with his wife (Lisa) and his two daughters (Hailey 9, Avery 5).
A lifelong Chicago native, Newman is an avid golfer, a fitness fan, and a classically trained pianist.
Self-service BI and data discovery will rapidly expand the number of users of BI solutions. Yet many of these more casual users will not be well versed in BI and visualization best practices.
When your user base rapidly expands to more casual users, you need to help educate them on what is important. For example, one IT manager told me that his casual BI users were making visualizations with very difficult-to-read charts and customizing color palettes to incredible degrees.
I had a similar experience when I was a technical writer. One of our lead writers was so concerned with the readability of every sentence that he went through the 300+ page manuals (yes, they were printed then) and manually adjusted all of the line breaks and page breaks. (!) Yes, readability was incrementally improved. But any subsequent change (technical updates, edits, inserting larger graphics) required re-adjusting all of those manual “optimizations.” The time it took just to do the additional optimization was incredible, to say nothing of maintaining it. Meanwhile, the technical writing team was falling behind on new deliverables.
The same scenario applies to your new casual BI users. This new group needs guidance to help them focus on the highest value practices:
Customization of color and appearance of visualizations: When is this customization necessary for a management deliverable, versus indulging an OCD tendency? I too have to stop myself from obsessing about the font, the line spacing, and the fact that a certain blue is just a bit different from another shade of blue. Yes, these options do matter. But help these casual users determine when that time is well spent.
Proper visualizations: When is a spinning 3D pie chart necessary to grab someone’s attention? BI professionals would firmly say “NEVER!” But these casual users do not have a lot of depth in BI best practices. Give them a few simple guidelines about when “flash” starts to get in the way of understanding. Consider offering a monthly one-hour Lunch and Learn that shows them how to create impactful, polished visuals. Understanding whether their visualizations will be viewed casually on the way to a meeting, or dissected at a laptop, also helps determine how much time to spend optimizing a visualization. No, you can’t just mandate that they all read Tufte.
Predictive: Provide advanced analytics capabilities like forecasting and regression directly in their casual BI tools. Using these capabilities will really help them wow their audience with substance instead of flash.
Feature requests: Make sure you understand the motivation and business value behind some of the casual users’ requests. These casual users are less likely to understand the implications of supporting specific requests across an enterprise, so make sure you are collaborating on use cases and priorities for substantive requests.
By working with your casual BI users on the above points, you will be able to collectively understand when the absolute exact request is critical (and supports good visualization practices), and when it is an “optimization” that may impact productivity. In many cases, “good” is good enough for the fast turnaround of data discovery.
Next week, I’ll wrap this series up with hints on getting your casual users to embrace the “we” not “me” mentality.
Last August, a woman arrived at a Reno, Nevada, hospital and told the attending doctors that she had recently returned from an extended trip to India, where she had broken her right thighbone two years earlier. The woman, who was in her 70s, had subsequently developed an infection in her thigh and hip for which she was hospitalized in India several times. The Reno doctors recognized that the infection was serious—and the visit to India, where antibiotic-resistant bacteria run rampant, raised red flags.
When none of the 14 antibiotics the physicians used to treat the woman worked, they sent a sample of the bacterium to the U.S. Centers for Disease Control (CDC) for testing. The CDC confirmed the doctors’ worst fears: the woman had a class of microbe called carbapenem-resistant Enterobacteriaceae (CRE). Carbapenems are a powerful class of antibiotics used as last-resort treatment for multidrug-resistant infections. The CDC further found that, in this patient’s case, the pathogen was impervious to all 26 antibiotics approved by the U.S. Food and Drug Administration (FDA).
In other words, there was no cure.
This is just the latest alarming development signaling the end of the road for antibiotics as we know them. In September, the woman died from septic shock, in which an infection takes over and shuts down the body’s systems, according to the CDC’s Morbidity and Mortality Weekly Report.
Other antibiotic options, had they been available, might have saved the Nevada woman. But the solution to the larger problem won’t be a new drug. It will have to be an entirely new approach to the diagnosis of infectious disease, to the use of antibiotics, and to the monitoring of antimicrobial resistance (AMR)—all enabled by new technology.
Keeping an Eye Out for Outbreaks
Like others who are leading the fight against AMR, Dr. Steven Solomon has no illusions about the difficulty of the challenge. “It is the single most complex problem in all of medicine and public health—far outpacing the complexity and the difficulty of any other problem that we face,” says Solomon, who is a global health consultant and former director of the CDC’s Office of Antimicrobial Resistance.
Solomon wants to take the battle against AMR beyond the laboratory. In his view, surveillance—tracking and analyzing various data on AMR—is critical, particularly given how quickly and widely it spreads. But surveillance efforts are currently fraught with shortcomings. The available data is fragmented and often not comparable. Hospitals fail to collect the representative samples necessary for surveillance analytics, collecting data only on those patients who experience resistance and not on those who get better. Laboratories use a wide variety of testing methods, and reporting is not always consistent or complete.
Surveillance can serve as an early warning system. But weaknesses in these systems have caused public health officials to consistently underestimate the impact of AMR in loss of lives and financial costs. That’s why improving surveillance must be a top priority, says Solomon, who previously served as chair of the U.S. Federal Interagency Task Force on AMR and has been tracking the advance of AMR since he joined the U.S. Public Health Service in 1981.
A Collaborative Diagnosis
Ineffective surveillance has also contributed to huge growth in the use of antibiotics when they aren’t warranted. Strong patient demand and financial incentives for prescribing physicians are blamed for antibiotics abuse in China. India has become the largest consumer of antibiotics on the planet, in part because they are prescribed or sold for diarrheal diseases and upper respiratory infections for which they have limited value. And many countries allow individuals to purchase antibiotics over the counter, exacerbating misuse and overuse.
In the United States, antibiotics are improperly prescribed 50% of the time, according to CDC estimates. One study of adult patients visiting U.S. doctors to treat respiratory problems found that more than two-thirds of antibiotics were prescribed for conditions that were not infections at all or for infections caused by viruses—for which an antibiotic would do nothing. That’s 27 million courses of antibiotics wasted a year—just for respiratory problems—in the United States alone.
And even in countries where there are national guidelines for prescribing antibiotics, those guidelines aren’t always followed. A study published in medical journal Family Practice showed that Swedish doctors, both those trained in Sweden and those trained abroad, inconsistently followed rules for prescribing antibiotics.
Solomon strongly believes that, worldwide, doctors need to expand their use of technology in their offices or at the bedside to guide them through a more rational approach to antibiotic use. Doctors have traditionally been reluctant to adopt digital technologies, but Solomon thinks that the AMR crisis could change that. New digital tools could help doctors and hospitals integrate guidelines for optimal antibiotic prescribing into their everyday treatment routines.
“Human-computer interactions are critical, as the amount of information available on antibiotic resistance far exceeds the ability of humans to process it,” says Solomon. “It offers the possibility of greatly enhancing the utility of computer-assisted physician order entry (CPOE), combined with clinical decision support.” Healthcare facilities could embed relevant information and protocols at the point of care, guiding the physician through diagnosis and prescription and, as a byproduct, facilitating the collection and reporting of antibiotic use.
Cincinnati Children’s Hospital’s antibiotic stewardship division has deployed a software program that gathers information from electronic medical records, order entries, computerized laboratory and pathology reports, and more. The system measures baseline antimicrobial use, dosing, duration, costs, and use patterns. It also analyzes bacteria and trends in their susceptibilities and helps with clinical decision making and prescription choices. The goal, says Dr. David Haslam, who heads the program, is to decrease the use of “big gun” super antibiotics in favor of more targeted treatment.
While this approach is not yet widespread, there is consensus that incorporating such clinical-decision support into electronic health records will help improve quality of care, contain costs, and reduce overtreatment in healthcare overall—not just in AMR. A 2013 randomized clinical trial found that doctors who used decision-support tools were significantly less likely to order antibiotics than those in the control group and prescribed 50% fewer broad-spectrum antibiotics.
Putting mobile devices into doctors’ hands could also help them accept decision support, believes Solomon. Last summer, Scotland’s National Health Service developed an antimicrobial companion app to give practitioners nationwide mobile access to clinical guidance, as well as an audit tool to support boards in gathering data for local and national use.
“The immediacy and the consistency of the input to physicians at the time of ordering antibiotics may significantly help address the problem of overprescribing in ways that less-immediate interventions have failed to do,” Solomon says. In addition, handheld devices with so-called lab-on-a-chip technology could be used to test clinical specimens at the bedside and transmit the data across cellular or satellite networks in areas where infrastructure is more limited.
Artificial intelligence (AI) and machine learning can also become invaluable technology collaborators to help doctors more precisely diagnose and treat infection. In such a system, “the physician and the AI program are really ‘co-prescribing,’” says Solomon. “The AI can handle so much more information than the physician and make recommendations that can incorporate more input on the type of infection, the patient’s physiologic status and history, and resistance patterns of recent isolates in that ward, in that hospital, and in the community.”
Speed Is Everything
Growing bacteria in a dish has never appealed to Dr. James Davis, a computational biologist with joint appointments at Argonne National Laboratory and the University of Chicago Computation Institute. The first of a growing breed of computational biologists, Davis chose a PhD advisor in 2004 who was steeped in bioinformatics technology “because you could see that things were starting to change,” he says. He was one of the first in his microbiology department to submit a completely “dry” dissertation—that is, one that was all digital with nothing grown in a lab.
Upon graduation, Davis wanted to see if it was possible to predict whether an organism would be susceptible or resistant to a given antibiotic, leading him to explore the potential of machine learning to predict AMR.
As the availability of cheap computing power has gone up and the cost of genome sequencing has gone down, it has become possible to sequence a pathogen sample in order to detect its resistance mechanisms. This could allow doctors to identify the nature of an infection in minutes instead of hours or days, says Davis.
Davis is part of a team creating a giant database of bacterial genomes with AMR metadata for the Pathosystems Resource Integration Center (PATRIC), funded by the U.S. National Institute of Allergy and Infectious Diseases to collect data on priority pathogens, such as tuberculosis and gonorrhea.
Because the current inability to identify microbes quickly is one of the biggest roadblocks to making an accurate diagnosis, the team’s work is critically important. The standard method for identifying drug resistance is to take a sample from a wound, blood, or urine and expose the resident bacteria to various antibiotics. If the bacterial colony continues to divide and thrive despite the presence of a normally effective drug, it indicates resistance. The process typically takes between 16 and 20 hours, itself an inordinate amount of time in matters of life and death. For certain strains of antibiotic-resistant tuberculosis, though, such testing can take a week. While physicians are waiting for test results, they often prescribe broad-spectrum antibiotics or make a best guess about what drug will work based on their knowledge of what’s happening in their hospital, “and in the meantime, you either get better,” says Davis, “or you don’t.”
At PATRIC, researchers are using machine-learning classifiers to identify regions of the genome involved in antibiotic resistance that could form the foundation for a “laboratory free” process for predicting resistance. Being able to identify the genetic mechanisms of AMR and predict the behavior of bacterial pathogens without petri dishes could inform clinical decision making and improve reaction time. Thus far, the researchers have developed machine-learning classifiers for identifying antibiotic resistance in Acinetobacter baumannii (a big player in hospital-acquired infection), methicillin-resistant Staphylococcus aureus (a.k.a. MRSA, a worldwide problem), and Streptococcus pneumoniae (a leading cause of bacterial meningitis), with accuracies ranging from 88% to 99%.
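PATRIC's actual classifiers are far more sophisticated, but the core idea—turning genome sequences into k-mer (short subsequence) count features and assigning a new isolate to the class whose profile it most resembles—can be sketched in a few lines. The sequences and labels below are invented toy data, and the nearest-centroid scoring is a deliberately simple stand-in for the machine-learning models the researchers use:

```python
from collections import Counter

def kmer_counts(seq, k=3):
    """Count overlapping k-mers in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

# Toy training data: (genome fragment, resistance label). Real pipelines
# use whole genomes with AMR phenotype metadata; these are invented.
training = [
    ("ATGGCCATTGTAATG", "resistant"),
    ("ATGGCGATTGTCATG", "resistant"),
    ("TTACCGGTAACCGGT", "susceptible"),
    ("TTACCGCTAACCAGT", "susceptible"),
]

# Build one average k-mer profile (centroid) per class.
centroids = {}
for label in ("resistant", "susceptible"):
    profile, n = Counter(), 0
    for seq, lab in training:
        if lab == label:
            profile.update(kmer_counts(seq))
            n += 1
    centroids[label] = {kmer: count / n for kmer, count in profile.items()}

def classify(seq):
    """Assign the label whose centroid shares the most k-mer weight."""
    counts = kmer_counts(seq)
    scores = {
        label: sum(counts[kmer] * w for kmer, w in centroid.items())
        for label, centroid in centroids.items()
    }
    return max(scores, key=scores.get)

print(classify("ATGGCCATTGTCATG"))  # resembles the resistant fragments
```

In practice the feature vectors span millions of k-mers across thousands of genomes, and the classifiers are trained models rather than simple centroids—but the lab-free premise is the same: sequence, featurize, predict.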
Houston Methodist Hospital, which uses the PATRIC database, is researching multidrug-resistant bacteria, specifically MRSA. Not only does resistance increase the cost of care, but people with MRSA are 64% more likely to die than people with a nonresistant form of the infection, according to WHO. Houston Methodist is investigating the molecular genetic causes of drug resistance in MRSA in order to identify new treatment approaches and help develop novel antimicrobial agents.
The Hunt for a New Class of Antibiotics
There are antibiotic-resistant bacteria, and then there’s Clostridium difficile—a.k.a. C. difficile—a bacterium that attacks the intestines even in young and healthy patients in hospitals after the use of antibiotics.
It is because of C. difficile that Dr. L. Clifford McDonald jumped into the AMR fight. The epidemiologist was finishing his work analyzing the spread of SARS in Toronto hospitals in 2004 when he turned his attention to C. difficile, convinced that the bacteria would become more common and more deadly. He was right, and today he’s at the forefront of treating the infection and preventing the spread of AMR as senior advisor for science and integrity in the CDC’s Division of Healthcare Quality Promotion. “[AMR] is an area that we’re funding heavily…insofar as the CDC budget can fund anything heavily,” says McDonald, whose group has awarded $14 million in contracts for innovative anti-AMR approaches.
Developing new antibiotics is a major part of the AMR battle. The majority of new antibiotics developed in recent years have been variations of existing drug classes. It’s been three decades since the last new class of antibiotics was introduced. Less than 5% of venture capital in pharmaceutical R&D is focused on antimicrobial development. A 2008 study found that less than 10% of the 167 antibiotics in development at the time had a new “mechanism of action” to deal with multidrug resistance. “The low-hanging fruit [of antibiotic development] has been picked,” noted a WHO report.
Researchers will have to dig much deeper to develop novel medicines. Machine learning could help drug developers sort through much larger data sets and go about the capital-intensive drug development process in a more prescriptive fashion, synthesizing those molecules most likely to have an impact.
McDonald believes that it will become easier to find new antibiotics if we gain a better understanding of the communities of bacteria living in each of us—as many as 1,000 different types of microbes live in our intestines, for example. Disruption to those microbial communities—our “microbiome”—can herald AMR. McDonald says that Big Data and machine learning will be needed to unlock our microbiomes, and that’s where much of the medical community’s investment is going.
He predicts that within five years, hospitals will take fecal samples or skin swabs and sequence the microorganisms in them as a kind of pulse check on antibiotic resistance. “Just doing the bioinformatics to sort out what’s there and the types of antibiotic resistance that might be in that microbiome is a Big Data challenge,” McDonald says. “The only way to make sense of it, going forward, will be advanced analytic techniques, which will no doubt include machine learning.”
Reducing Resistance on the Farm
Bringing information closer to where it’s needed could also help reduce agriculture’s contribution to the antibiotic resistance problem. Antibiotics are widely given to livestock to promote growth or prevent disease. In the United States, more kilograms of antibiotics are administered to animals than to people, according to data from the FDA.
One company has developed a rapid, on-farm diagnostics tool to provide livestock producers with more accurate disease detection to make more informed management and treatment decisions, which it says has demonstrated a 47% to 59% reduction in antibiotic usage. Such systems, combined with pressure or regulations to reduce antibiotic use in meat production, could also help turn the AMR tide.
Breaking Down Data Silos Is the First Step
Adding to the complexity of the fight against AMR is the structure and culture of the global healthcare system itself. Historically, healthcare has been a siloed industry, notorious for its scattered approach focused on transactions rather than healthy outcomes or the true value of treatment. There’s no definitive data on the impact of AMR worldwide; the best we can do is infer estimates from the information that does exist.
The biggest issue is the availability of good data to share through mobile solutions, to drive HCI clinical-decision support tools, and to feed supercomputers and machine-learning platforms. “We have a fragmented healthcare delivery system and therefore we have fragmented information. Getting these sources of data all into one place and then enabling them all to talk to each other has been problematic,” McDonald says.
Collecting, integrating, and sharing AMR-related data on a national and ultimately global scale will be necessary to better understand the issue. HCI and mobile tools can help doctors, hospitals, and public health authorities collect more information while advanced analytics, machine learning, and in-memory computing can enable them to analyze that data in close to real time. As a result, we’ll better understand patterns of resistance from the bedside to the community and up to national and international levels, says Solomon. The good news is that new technology capabilities like AI and new potential streams of data are coming online as an era of data sharing in healthcare is beginning to dawn, adds McDonald.
The ideal goal is a digitally enabled virtuous cycle of information and treatment that could save millions of dollars, lives, and perhaps even civilization if we can get there.
Despite the progress made in some countries, I am also aware of others that are still resistant to digitizing their economy and automating operations. What’s the difference between firms that are digital leaders and those that are slow to mature? From my perspective in working with a variety of businesses throughout Europe, it’s a combination of diversity and technology availability.
European companies are hardly homogenous. The continent comprises 47 countries whose communities speak any of some 225 languages, and each country is at a different stage of digital development, economic stability, and workforce need.
Nevertheless, as a whole, European firms do prioritize customer acquisition as well as improving efficiency and reducing costs. Over one-third of small and midsize companies are investing in collaboration software, customer relationship management solutions, e-commerce platforms, analytics, and talent management applications. Steadily, business leaders are finding better ways to go beyond data collection, applying predictive analytics and machine learning to gain real-time insight and automate processes where possible.
Small and midsize businesses have a distinct advantage in this area over their larger rivals because they can, by nature, adopt new technology and practices quickly and act on decisions with greater agility. Nearly two-thirds (64%) of European firms are embracing the early stages of digitalization and planning to mature over time. Yet the level of adoption depends largely on the leadership team’s commitment.
For many small and midsize companies across this region, the path to digital maturity resides in the cloud, more so than on-premise software deployment. For example, the flexibility associated with cloud deployment is viewed as a top attribute, especially among U.K. firms. This brings us back to the diversity of our region. Some countries prioritize personal data security while others may be more concerned with the ability to access the information they need in even the most remote of areas.
Technology alone does not deliver digital transformation
Digital transformation is certainly worth the effort for European firms. Between 60% and 90% of small and midsize European businesses say their technology investments have met or exceeded their expectations – indicative of the steady, powerful transitions enabled by cloud computing. Companies of all sizes now get the same access to the latest technology, data storage, and IT resources.
However, it is also important to note that a cloud platform is only as effective as the long-term digital strategy it enables. To invigorate transformative change, leadership needs to go beyond technology and adopt a mindset that embraces new ideas, continuously tests the fitness of business models and processes, and allows the flexibility to evolve the company as quickly as market dynamics change. By taking a step back and integrating digital objectives throughout the business strategy, leadership can pull together the elements needed to turn technology investments into differentiating, sustainable change: hiring the best talent with the right skills, and onboarding partners and suppliers with a complementary or shared digital vision and capability.
The IDC Infobrief confirms what I have known all along: small and midsize businesses are beginning to digitally mature and maintain a strategy that is relevant to their end-to-end processes. Furthering their digital transformation goes hand in hand with their ability to ignite a transformational force that will likely progress Europe’s culture, social structure, and economy.