(Re)Programming Life

Kai Goerlich

We live in the Anthropocene epoch: human activity is now the dominant influence on the planet. Humanity already consumes the resources of 1.6 Earths each year, and even moderate estimates suggest that, if current trends continue, we’ll need the equivalent of two Earths to support us.

At the same time, we are perfecting the ability to alter our ecosystems at the most fundamental level – DNA and RNA – an ability that could theoretically reverse some of the damage we’ve done, or at least stem the continuing loss of biodiversity and habitat, both of which the World Economic Forum’s Global Risks Report identifies as major risks to our future. Of course, our rapidly advancing genomic capabilities also come with difficult ethical questions.

However, gene editing will also open new possibilities for companies to create new lines of revenue and protect existing ones by helping to preserve biodiversity, manage ecosystem loss more safely, and sustain agricultural production. A recent article in Nature points out that genome editing “allows much smaller changes to be made to DNA compared with conventional genetic engineering,” which might prove more palatable to the public and regulators.

The DNA revolution

The discovery of the structure of DNA by James Watson and Francis Crick in 1953, building on Rosalind Franklin’s X-ray diffraction work, transformed how we study and interact with the world around us. The focus shifted from exploring nature and analyzing plant and animal anatomy to examining life at the molecular level. Over the following decades, humans developed a comprehensive understanding of molecular biology.

Once the roles of DNA and RNA became clear – DNA stores the information of life, while RNA translates that code and regulates the translation – it was only a matter of time before we figured out how to take on the role of programmers as well. When Kary Mullis invented the polymerase chain reaction (PCR) in 1983 – a technique, also called molecular photocopying, that rapidly copies a chosen stretch of DNA – the race was on.

The Human Genome Project sequenced the first full human genome in 2003; it took a collaboration of some 20 universities and research centers working for 13 years and spending roughly $3 billion to do it. Thanks to high-throughput computing and massively parallel next-generation sequencing (NGS) technologies, sequencing speed has more than doubled every two years while costs have continued to fall – the field is advancing faster than Moore’s Law. Last September, Veritas Genetics announced $1,000 full-genome sequencing, including interpretation, for participants in the Personal Genome Project, and it’s just a matter of time before individuals can get their genomes sequenced for $100 or less.
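
To make that pace concrete, here is a back-of-the-envelope sketch in Python. The starting cost, the time span, and the five-fold drop every two years are illustrative assumptions chosen to land near the publicly reported figures, not exact data.

    # Back-of-the-envelope comparison (illustrative assumptions, not exact figures):
    # a Moore's-Law pace halves cost every two years; sequencing costs have fallen
    # far faster -- assumed here to drop roughly five-fold every two years.

    START_COST = 100_000_000   # assumed cost per genome in the early 2000s, in dollars
    YEARS = 14                 # roughly the span from the early 2000s to the $1,000 genome

    moore_pace = START_COST * 0.5 ** (YEARS / 2)           # halving every two years
    sequencing_pace = START_COST * (1 / 5) ** (YEARS / 2)  # assumed five-fold drop

    print(f"Moore's-Law pace after {YEARS} years:   ~${moore_pace:,.0f}")
    print(f"Sequencing-like pace after {YEARS} years: ~${sequencing_pace:,.0f}")

Under these assumptions, a Moore’s-Law pace would still leave a genome costing several hundred thousand dollars today, while the faster, sequencing-like pace lands near the $1,000 price point mentioned above.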

“What we observe is a turning point in life sciences and medicine. Today our ability to generate massive amounts of biological data of any species and individual is ahead of our capabilities to interpret this vast amount of information. Working as a researcher, or even as a clinician, can feel like listening to all symphonies from Haydn to Shostakovich in parallel and trying to make sense out of it. Creating standards to annotate and exchange the data, finding the right algorithms and analytics to turn those curated data into insights will be a major challenge in the near future,” says Dr. Péter Adorján, principal expert, Precision Medicine at SAP.

Engineering life

Sequencing genomes is one thing; editing genes in living organisms is a different matter altogether. For the past 15 years, we have had techniques to edit human DNA by using a disabled virus (known as a viral vector) to deliver new genetic material to a cell. However, introducing foreign genetic material into cells is an imprecise process and comes with a number of logistical drawbacks.

Then along came CRISPR/Cas9, a naturally occurring immune system found in a wide range of bacteria, whose function was worked out in the mid-2000s. In a biological version of “cut-and-paste,” CRISPR snips out a short sequence of an invading virus’s DNA and, when the virus attacks again, uses that stored sequence to bind to the viral DNA and cut it at a specific spot. Less than a decade after this discovery, scientists figured out how to harness CRISPR/Cas9 for genome editing.
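
To illustrate the targeting step behind this “cut-and-paste” picture, here is a toy Python sketch. It is not a biological model: the DNA and guide sequences are invented, and it simply assumes an exact 20-letter match followed by an “NGG” PAM motif, with the cut placed three letters upstream of the PAM, which is roughly how Cas9 behaves.

    # Toy illustration of CRISPR/Cas9-style targeting (not a biological model).
    # Assumptions: the 20-nt guide must match the DNA exactly, it must be followed
    # by an "NGG" PAM motif, and the cut falls 3 nt upstream of the PAM.

    def find_cut_site(dna, guide):
        """Return the index in dna where a Cas9-like cut would fall, or None."""
        assert len(guide) == 20, "Cas9 guides are typically 20 nt long"
        for i in range(len(dna) - len(guide) - 2):
            protospacer = dna[i:i + 20]
            pam = dna[i + 20:i + 23]
            if protospacer == guide and pam[1:] == "GG":
                return i + 17   # cut between positions 17 and 18 of the protospacer
        return None

    # Invented sequences, for illustration only.
    virus_dna = "TTACGATTACAGATTACAGATTACAGGTTCGA"
    guide = "GATTACAGATTACAGATTAC"   # the 20-nt spacer the bacterium stored earlier

    site = find_cut_site(virus_dna, guide)
    if site is not None:
        print(virus_dna[:site] + " | " + virus_dna[site:])   # shows where the DNA is cleaved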

The approach is currently being tested as a treatment for disease and could soon be applied to a wide range of disorders. Once CRISPR is fully tested, it could be used to correct faulty genes in embryos, effectively removing those genes from the gene pool. Theoretically, this form of gene editing should make genetic modification safer: changes could be better planned, executed, and reviewed.

“The accuracy of the CRISPR method is simply stunning. The resulting medicine will improve outcomes and reduce side effects for many gene-based healthcare problems,” says Dr. Adorján. “If it holds its promises, it will probably change medicine within 10 years more than what we have observed in the last 50 years. But the methodology will raise fundamental ethical issues of how we cope with genetic optimizations of embryos or modifying germline cells, which would impact not only the individual but all subsequent generations as well.”

The impact on society and business could be profound and broad. In healthcare, gene editing is already showing progress in areas such as curing chronic hepatitis B infection and addressing the shortage of organs for transplants. A group of scientists in San Diego used gene editing to create a population of mosquitoes resistant to the malaria parasite, and thus unable to spread the disease. As an article in Chemistry World put it, gene editing is now “more than just a science – it’s big business too.” The genome editing market is expected to reach $3.5 billion by 2019, according to MarketsandMarkets. DuPont is already growing CRISPR-edited corn and wheat in greenhouses in an effort to make corn drought-resistant and improve wheat yields, and the company’s vice president for agricultural biotechnology has predicted that gene editing will introduce a new wave of products and profits. Novartis is working with gene-editing startups to use CRISPR for engineering immune cells and blood stem cells and as a research tool for drug discovery.

The most sweeping of these applications are likely decades off, but they already raise important ethical questions, because gene editing could affect not only the host organism but the larger ecosystem, for better or for worse. How might a genetically edited mosquito population, for example, affect the rest of its ecosystem? These new tools could give us novel ways of managing our impact on the world around us – perhaps even helping to solve world hunger or reverse climate change – and create new business opportunities, but they carry real risks as well.

Beyond gene hacking

The future of digital biology will not play out only at the molecular level, though. It will advance in the context of the larger world. Because ecological systems are complex, fragile networks, even the smallest changes can have a dramatic impact. That means gene editing alone will not be enough to manage humanity’s impact on the world.

But genomics technology isn’t advancing in isolation.

As we’ve pointed out in previous Digital Futures posts, our world will be increasingly populated with sensors and the advanced computing power to collect and analyze the data they produce. By linking our growing wealth of biological data with these rapidly expanding streams of sensor data, research organizations and companies could develop a more complete understanding of our environment – from rainforests to oceans to agricultural systems – at the macro level as well.

Researchers are already developing chip-scale sensors that can be placed unobtrusively in the environment to measure molecular changes, with uses such as monitoring environmental pollutants in real time, detecting toxic leaks in an industrial plant, or diagnosing disease from a patient’s breath. The data from such advanced sensors could also enable researchers and organizations to model and measure how changes at the molecular level affect larger ecosystems, and vice versa, with applications ranging from environmental sustainability to biomedicine. That intelligence would put scientists and businesses in a much better position to manage humanity’s impact on the Earth, the economy, and our own health, and even to grapple with the ethical questions that gene editing raises.
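
As a sketch of the kind of real-time monitoring described above, the following Python example flags readings that deviate sharply from a recent rolling baseline. The readings, window size, and threshold are invented for illustration and are not tied to any particular sensor.

    # Minimal sketch of real-time monitoring on a stream of sensor readings.
    # The readings, window size, and alert threshold are illustrative assumptions,
    # not the behavior of any particular chip-scale sensor.

    from collections import deque

    def monitor(readings, window=10, sigma=3.0):
        """Flag readings that deviate sharply from the recent rolling baseline."""
        history = deque(maxlen=window)
        for t, value in enumerate(readings):
            if len(history) == window:
                mean = sum(history) / window
                spread = (sum((x - mean) ** 2 for x in history) / window) ** 0.5
                if spread > 0 and abs(value - mean) > sigma * spread:
                    print(f"t={t}: reading {value:.2f} deviates from baseline {mean:.2f}")
            history.append(value)

    # Simulated parts-per-million readings with one sudden spike (a hypothetical leak).
    simulated = [5.0, 5.1, 4.9, 5.2, 5.0, 5.1, 4.8, 5.0, 5.2, 4.9, 5.1, 5.0, 9.7, 5.1]
    monitor(simulated)

The same pattern scales up: replace the hard-coded list with a live feed and the print statement with an alert to whatever monitoring system is in place.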

Businesses in healthcare and those with large ecological footprints – agriculture, fishing, forestry, mining, and oil & gas, for example – could use modern sensor and genome technology to improve their risk assessments, act more sustainably, and potentially find new business ideas as well.

In order to get to that point, we’ll need to take three key steps. First, we must digitize our existing and growing understanding of life on Earth – all the biological, paleontological, and geological collections we’ve gathered over the centuries – to make it more easily accessible. Then, using the power of sensors and analytics, we can begin to scan the environment to gather critical data on our ecosystems and the impact we have on them. Finally, using gene sequencing, we can begin to explore the changes we might make by editing things at the molecular level and simulate the outcomes on a macro scale.

A designer future?

Where will these advances take us? There are a number of possible scenarios.

  1. Limited, regulated usage: We might see a future in which we simply fix molecular flaws and allow gene editing only in very specific healthcare contexts. While the technology for fast, effective DNA sequencing and editing would continue to advance, its applications would be available only to a niche of professionals. We might allow gene editing to create certain designer plants to cope with climate change, for example, but that application would be highly regulated.
  2. A hybrid approach: Broader acceptance of complex gene editing would allow us to alter the natural world more significantly, editing known life forms and perhaps designing new ones. Gene editing would still be reserved for professionals. Healthcare would embrace a hybrid of classical medicine and gene editing. Humankind would begin to experiment with ecosystem engineering based on advanced insight and study, generating ethical controversy and long-running disputes. Some regulation would emerge in sensitive areas.
  3. Wide acceptance: In a world where IT and technology are entirely democratized and gene editing is widely accepted, we could wake up to a second creation. In this scenario, gene editing would be allowed with little restriction, with toolkits available to consumers as well as professionals. The healthcare industry would apply gene editing on a grand scale, and designer plants and animals would become commonplace. But, thanks to an increasingly advanced understanding of how nature operates at the macro and micro levels, we could better understand and manage the consequences.

Download the executive brief Gene Editing: Big Science, Big Business.


To learn more about how exponential technology will affect business and life, see Digital Futures in the Digitalist Magazine.


About Kai Goerlich

Kai Goerlich is the Chief Futurist at the SAP Innovation Center Network. His specialties include competitive intelligence, market intelligence, corporate foresight, trends, futuring, and ideation.

Share your thoughts with Kai on Twitter @KaiGoe.

Transform Or Die: What Will You Do In The Digital Economy?

Scott Feldman and Puneet Suppal

By now, most executives are keenly aware that the digital economy can be either an opportunity or a threat. The question is not whether they should engage their business in it. Rather, it’s how to unleash the power of digital technology while maintaining a healthy business, leveraging existing IT investments, and innovating without disrupting themselves.

Yet most of those executives are shying away from such a challenge. According to a recent study by MIT Sloan and Capgemini, only 15% of CEOs are executing a digital strategy, even though 90% agree that the digital economy will impact their industry. While these businesses ignore this reality, early adopters of digital transformation are achieving 9% higher revenue creation, 26% greater impact on profitability, and 12% higher market valuation.

Why aren’t more leaders willing to transform their business and seize the opportunity of our hyperconnected world? The answer is as simple as human nature. Humans are innately uncomfortable with change; we find comfort in stability and predictability. Unfortunately, the digital economy offers neither – it’s fast-moving and always evolving.

Digital transformation is no longer an option – it’s the imperative

At this moment, we are witnessing an explosion of connections, data, and innovations. And just as this hyperconnectivity has changed the game, customers are radically changing the rules – demanding simple, seamless, and personalized experiences at every touch point.

Billions of people are using social and digital communities to provide services, share insights, and engage in commerce. All the while, new channels for engaging with customers are being created, and new ways of making better use of resources are emerging. It is these communities that allow companies not only to give customers what they want, but also to align efforts across the business network to maximize value potential.

To seize the opportunities ahead, businesses must go beyond sensors, Big Data, analytics, and social media. More important, they need to reinvent themselves in a manner that is compatible with an increasingly digital world and its inhabitants (a.k.a. your consumers).

Here are a few companies that understand the importance of digital transformation – and are reaping the rewards:

  1. Under Armour: No longer is this widely popular athletic brand just selling shoes and apparel; it is connecting 38 million people on a digital platform. By focusing on the services side of the business, Under Armour is poised to become a lifestyle advisor and health consultant, using its product side as the enabler.
  2. Port of Hamburg: Europe’s second-largest port is keeping carrier trucks and ships productive around the clock. By fusing facility, weather, and traffic conditions with vehicle availability and shipment schedules, the port increased container handling capacity by 178% without expanding its physical space.
  3. Haier Asia: This top-ranking multinational consumer electronics and home appliances company decided to disrupt itself before someone else did. It used a two-pronged approach to digital transformation to create a service-based model, seizing the potential of changing consumer behaviors and accelerating product development.
  4. Uber: This startup darling is more than just a taxi service. It is transforming how urban logistics operates through a technology trifecta: Big Data, cloud, and mobile.
  5. American Society of Clinical Oncology (ASCO): Even nonprofits can benefit from digital transformation. ASCO is transforming care for cancer patients worldwide by consolidating patient information with its CancerLinQ platform. By unlocking knowledge and value from the 97% of cancer patients who are not involved in clinical trials, healthcare providers can drive better, more data-driven decision-making and outcomes.

It’s time to take action 

During the SAP Executive Technology Summit at SAP TechEd on October 19–20, an elite group of CIOs, CTOs, and corporate executives will gather to discuss the challenges of digital transformation and how they can solve them. With the freedom of open, candid, and interactive discussions led by SAP Board Members and senior technology leadership, delegates will exchange ideas on how to get on the right path while leveraging their existing technology infrastructure.

Stay tuned for exclusive insights from this invitation-only event in our next blog!
Scott Feldman is Global Head of the SAP HANA Customer Community at SAP. Connect with him on Twitter @sfeldman0.

Puneet Suppal drives Solution Strategy and Adoption (Customer Innovation & IoT) at SAP Labs. Connect with him on Twitter @puneetsuppal.

 


About Scott Feldman and Puneet Suppal

Scott Feldman is the Head of SAP HANA International Customer Community. Puneet Suppal is the Customer Co-Innovation & Solution Adoption Executive at SAP.

What Is Digital Transformation?

Andreas Schmitz

Achieving quantum leaps through disruption, using data in new contexts, designing offerings for more than just Generation Y: digital transformation affects us all. It’s time for a detailed look at its key aspects.

Data finding its way into new settings

Archiving all of a company’s internal information until the end of time is generally a good idea, as it gives the boss the assurance that nothing will be lost. Enabling him or her to create bar graphs and pie charts from sales trends – preferably in real time, of course – is even better.

But the best scenario of all is when the boss can incorporate data from external sources. All of a sudden, information on factors as seemingly mundane as the weather starts helping to explain fluctuations in sales and to fine-tune the company’s offerings. When the gusts of autumn begin to blow, for example, energy providers scale back solar production and ramp up their wind turbines. Here, external data provides a foundation for processes and decisions that were previously unattainable.
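
As a minimal sketch of what incorporating external data can look like in practice, the following Python/pandas example joins a small internal sales table with external weather readings. The tables, column names, and figures are invented placeholders, not real company data.

    # Minimal sketch: blending internal sales data with external weather data.
    # All tables, column names, and figures are invented placeholders.

    import pandas as pd

    sales = pd.DataFrame({
        "date":  ["2016-09-01", "2016-09-02", "2016-09-03", "2016-09-04"],
        "units": [120, 95, 180, 210],
    })
    weather = pd.DataFrame({
        "date":      ["2016-09-01", "2016-09-02", "2016-09-03", "2016-09-04"],
        "wind_kmh":  [12, 9, 34, 41],
        "sun_hours": [9.0, 10.5, 3.0, 2.5],
    })

    # Join on the shared date column so each day's sales sit next to that day's weather.
    combined = sales.merge(weather, on="date")
    print(combined)

    # A first question the combined view can answer: do sales track the wind?
    print("Correlation of units sold with wind speed:",
          round(combined["units"].corr(combined["wind_kmh"]), 2))

The point is not the particular numbers but the join itself: once external data shares a key with internal data, it can feed the same reports and decisions.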

Quantum leaps possible through disruption

While these advancements involve changes in existing workflows, there are also much more radical approaches that eschew conventional structures entirely.

“The aggressive use of data is transforming business models, facilitating new products and services, creating new processes, generating greater utility, and ushering in a new culture of management,” states Professor Walter Brenner of the University of St. Gallen in Switzerland, regarding the effects of digitalization.

Harnessing these benefits requires the application of innovative information and communication technology, especially the kind termed “disruptive.” A complete departure from existing structures may not necessarily be the actual goal, but it can occur as a consequence of this process.

Having had to contend with “only” one new technology at a time in the past, be it PCs, SAP software, SQL databases, or the Internet itself, companies are now facing an array of concurrent topics, such as the Internet of Things, social media, third-generation e-business, and tablets and smartphones. Professor Brenner thus believes that every good — and perhaps disruptive — idea can result in a “quantum leap in terms of data.”

Products and services shaped by customers

It has already been nearly seven years since the release of an app that enables customers to order and pay for taxis. Initially introduced in Berlin, Germany, mytaxi makes it possible to avoid waiting on hold for the next phone representative and pay by credit card while giving drivers greater independence from taxi dispatch centers. In addition, analyses of user data can lead to the creation of new services, such as for people who consistently order taxis at around the same time of day.

“Successful models focus on providing utility to the customer,” Professor Brenner explains. “In the beginning, at least, everything else is secondary.”

In this regard, the ride-hailing service Uber is a fair bit more radical. It bypasses the entire taxi industry and recruits private individuals who are willing to make themselves and their vehicles available for rides on the Uber platform. Similarly, Airbnb runs a platform travelers can use to book private accommodations instead of hotel rooms.

Long-established companies are also undergoing profound changes. The German publishing house Axel Springer SE, for instance, has acquired a number of startups, launched an online dating platform, and released an app that lets users collect loyalty points at retailers. Chairman and CEO Mathias Döpfner certainly also wants to return the company’s newspapers and other periodicals to profitability through paid models, but these endeavors sit somewhat at odds with the traditional notion of a publishing house that does nothing but publish.

The impact of digitalization transcends Generation Y

Digitalization is effecting changes in nearly every industry. Retailers will likely have no choice but to integrate their sales channels into an omnichannel approach. Seeking to make their data services as attractive as possible, BMW, Mercedes, and Audi have joined forces to purchase the digital map service HERE. Mechanical engineering companies are outfitting their equipment with sensors to reduce downtime and achieve further product improvements.

“The specific potential and risks at hand determine how and by what means each individual company approaches the subject of digitalization,” Professor Brenner reveals. The resulting services will ultimately benefit every customer – not just those belonging to Generation Y, who present a certain basic affinity for digital methods.

“Think of cars that notify the service center when their brakes or drive belts need to be replaced, offer parking assistance, or even handle parking for you,” Brenner offers. “This can be a big help to elderly people in particular.”

Chief digital officers: team members, not miracle workers

Making the transition to the digital future involves not only the CEO or the head of marketing or IT, but the entire company. These individuals play an important role as proponents of digital models, but it takes more than a chief digital officer alone.

For Professor Brenner, appointing a single person to the board of a DAX company to oversee digitalization is basically absurd. “Unless you’re talking about da Vinci or Leibniz born again, nobody could handle such a task,” he states.

In Brenner’s view, this is a topic for each and every department, and responsibilities should be assigned much like on a soccer field: “You’ve got a coach and the players – and the fans, as well, who are more or less what it’s all about.”

Here, the CIO neither competes with the CDO nor assumes an elevated position in the process of digital transformation. Implementing new data platforms like SAP HANA or Hadoop and leveraging sensor data in ways that are both technically and commercially viable: these are the tasks CIOs will face going forward.

“There are some fantastic jobs out there,” Brenner affirms.

Want more insight on managing digital transformation? See Three Keys To Winning In A World Of Disruption.



About Andreas Schmitz

Andreas Schmitz is a freelance journalist for SAP, covering a wide range of topics, from big data and the Internet of Things to HR, business innovation, and mobile.

Human Skills for the Digital Future

Dan Wellers and Kai Goerlich

Technology Evolves.
So Must We.


Technology replacing human effort is as old as the first stone axe, and so is the disruption it creates.
Thanks to deep learning and other advances in AI, machine learning is catching up to the human mind faster than expected.
How do we maintain our value in a world in which AI can perform many high-value tasks?


Uniquely Human Abilities

AI is excellent at automating routine knowledge work and generating new insights from existing data — but humans know what they don’t know.

We’re driven to explore, try new and risky things, and make a difference.

We deduce the existence of information we don’t yet know about.

We imagine radical new business models, products, and opportunities.

We have creativity, imagination, humor, ethics, persistence, and critical thinking.


There’s Nothing Soft About “Soft Skills”

To stay ahead of AI in an increasingly automated world, we need to start cultivating our most human abilities on a societal level. There’s nothing soft about these skills, and we can’t afford to leave them to chance.

We must revamp how and what we teach to nurture the critical skills of passion, curiosity, imagination, creativity, critical thinking, and persistence. In the era of AI, no one will be able to thrive without these abilities, and most people will need help acquiring and improving them.

Anything artificial intelligence does has to fit into a human-centered value system that takes our unique abilities into account. While we help AI get more powerful, we need to get better at being human.


Download the executive brief Human Skills for the Digital Future.


Read the full article The Human Factor in an AI Future.



About Dan Wellers

Dan Wellers is founder and leader of Digital Futures at SAP, a strategic insights and thought leadership discipline that explores how digital technologies drive exponential change in business and society.


About Kai Goerlich

Kai Goerlich is the Chief Futurist at the SAP Innovation Center Network. His specialties include competitive intelligence, market intelligence, corporate foresight, trends, futuring, and ideation.

Share your thoughts with Kai on Twitter @KaiGoe.


The Human Factor In An AI Future

Dan Wellers and Kai Goerlich

As artificial intelligence becomes more sophisticated and its ability to perform human tasks accelerates exponentially, we’re finally seeing some attempts to wrestle with what that means, not just for business, but for humanity as a whole.

From the first stone ax to the printing press to the latest ERP solution, technology that reduces or even eliminates physical and mental effort is as old as the human race itself. However, that doesn’t make each step forward any less uncomfortable for the people whose work is directly affected – and the rise of AI is qualitatively different from past developments.

Until now, we have developed technology to handle specific routine tasks: a human had to break a complex process down into its component tasks, determine how to automate each of them, and then create and refine the automation. AI is different. Because it can evaluate, select, act, and learn from its actions, it can be independent and self-sustaining.
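
To make that “evaluate, select, act, and learn” loop concrete, here is a minimal Python sketch of an agent choosing between two actions and improving from feedback. The payoffs, exploration rate, and learning rate are arbitrary illustrative choices; the sketch demonstrates the loop itself, not any production AI system.

    # Minimal "evaluate, select, act, learn" loop: a two-armed bandit agent.
    # Payoffs, exploration rate, and learning rate are arbitrary illustrative choices.

    import random

    random.seed(42)
    estimates = {"A": 1.0, "B": 1.0}    # optimistic starting estimates so both actions get tried
    true_payoff = {"A": 0.3, "B": 0.7}  # hidden from the agent

    for step in range(500):
        # Select: usually the best-looking action, occasionally explore at random.
        if random.random() < 0.1:
            action = random.choice(["A", "B"])
        else:
            action = max(estimates, key=estimates.get)
        # Act: the environment returns a noisy reward.
        reward = 1.0 if random.random() < true_payoff[action] else 0.0
        # Learn: nudge the estimate for that action toward the observed reward.
        estimates[action] += 0.1 * (reward - estimates[action])

    print(estimates)  # the estimate for "B" should end up well above the one for "A"

Contrast this with scripted automation, where a person would have to write out the mapping from situation to action in advance.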

Some people, like investor/inventor Elon Musk and Alibaba founder and chairman Jack Ma, are focusing intently on how AI will impact the labor market. It’s going to do far more than eliminate repetitive manual jobs like warehouse picking. Any job that involves routine problem-solving within existing structures, processes, and knowledge is ripe for handing over to a machine. Indeed, jobs like customer service, travel planning, medical diagnostics, stock trading, real estate, and even clothing design are already increasingly automated.

As for more complex problem-solving, we used to think it would take computers decades or even centuries to catch up to the nimble human mind, but we underestimated the exponential explosion of deep learning. IBM’s Watson trounced past Jeopardy! champions in 2011 – and just last year, Google DeepMind’s AlphaGo beat the reigning European champion at Go, a game once thought too complex for even the most sophisticated computer.

Where does AI leave humans?

This raises an urgent question for the future: How do human beings maintain our economic value in a world in which AI will keep getting better than us at more and more things?

The concept of the technological singularity – the point at which machines attain superhuman intelligence and permanently outpace the human mind – is based on the idea that human thinking can’t evolve fast enough to keep up with technology. However, the limits of human performance have yet to be found. It’s possible that people are only at risk of lagging behind machines because nothing has forced us to test ourselves at scale.

Other than a handful of notable individual thinkers, scientists, and artists, most of humanity has met survival-level needs through mostly repetitive tasks. Most people don’t have the time or energy for higher-level activities. But as the human race faces the unique challenge of imminent obsolescence, we need to think of those activities not as luxuries, but as necessities. As technology replaces our traditional economic value, the economic system may stop attaching value to us entirely unless we determine the unique value humanity offers – and what we can and must do to cultivate the uniquely human skills that deliver that value.

Honing the human advantage

As a species, humans are driven to push past boundaries, to try new things, to build something worthwhile, and to make a difference. We have strong instincts to explore and enjoy novelty and risk – but according to psychologist Mihaly Csikszentmihalyi, these instincts crumble if we don’t cultivate them.

AI is brilliant at automating routine knowledge work and generating new insights from existing data. What it can’t do is deduce the existence, or even the possibility, of information it isn’t already aware of. It can’t imagine radical new products and business models. Or ask previously unconceptualized questions. Or envision unimagined opportunities and achievements. AI doesn’t even have common sense! As theoretical physicist Michio Kaku says, a robot doesn’t know that water is wet or that strings can pull but not push. Nor can robots engage in what Kaku calls “intellectual capitalism” – activities that involve creativity, imagination, leadership, analysis, humor, and original thought.

At the moment, though, we don’t generally value these so-called “soft skills” enough to prioritize them. We expect people to develop their competency in emotional intelligence, cross-cultural awareness, curiosity, critical thinking, and persistence organically, as if these skills simply emerge on their own given enough time. But there’s nothing soft about these skills, and we can’t afford to leave them to chance.

Lessons in being human

To stay ahead of AI in an increasingly automated world, we need to start cultivating our most human abilities on a societal level – and to do so not just as soon as possible, but as early as possible.

Singularity University chairman Peter Diamandis, for example, advocates revamping the elementary school curriculum to nurture the critical skills of passion, curiosity, imagination, critical thinking, and persistence. He envisions a curriculum that, among other things, teaches kids to communicate, ask questions, solve problems with creativity, empathy, and ethics, and accept failure as an opportunity to try again. These concepts aren’t necessarily new – Waldorf and Montessori schools have been encouraging similar approaches for decades – but increasing automation and digitization make them newly relevant and urgent.

The Mastery Transcript Consortium is approaching the same problem from the opposite side, by starting with outcomes. This organization is pushing to redesign the secondary school transcript to better reflect whether and how high school students are acquiring the necessary combination of creative, critical, and analytical abilities. By measuring student achievement in a more nuanced way than through letter grades and test scores, the consortium’s approach would inherently require schools to reverse-engineer their curricula to emphasize those abilities.

Most critically, this isn’t simply a concern of high-tuition private schools and “good school districts” intended to create tomorrow’s executives and high-level knowledge workers. One critical aspect of the challenge we face is the assumption that the vast majority of people are inevitably destined for lives that don’t require creativity or critical thinking – that either they will somehow be able to thrive anyway or their inability to thrive isn’t a cause for concern. In the era of AI, no one will be able to thrive without these abilities, which means that everyone will need help acquiring them. For humanitarian, political, and economic reasons, we cannot just write off a large percentage of the population as disposable.

In the end, anything an AI does has to fit into a human-centered value system that takes our unique human abilities into account. Why would we want to give up our humanity in favor of letting machines determine whether or not an action or idea is valuable? Instead, while we let artificial intelligence get better at being what it is, we need to get better at being human. That’s how we’ll keep coming up with groundbreaking new ideas like jazz music, graphic novels, self-driving cars, blockchain, machine learning – and AI itself.

Read the executive brief Human Skills for the Digital Future.

Build an intelligent enterprise with AI and machine learning to unite human expertise and computer insights. Run live with SAP Leonardo.



About Dan Wellers

Dan Wellers is founder and leader of Digital Futures at SAP, a strategic insights and thought leadership discipline that explores how digital technologies drive exponential change in business and society.


About Kai Goerlich

Kai Goerlich is the Chief Futurist at the SAP Innovation Center Network. His specialties include competitive intelligence, market intelligence, corporate foresight, trends, futuring, and ideation.

Share your thoughts with Kai on Twitter @KaiGoe.