The Role Of Imagination In Creating The Next Set Of Breakthrough Innovations

Mukesh Gupta

I stumbled across “As We May Think,” an article that Director of the Office of Scientific Research and Development Vannevar Bush wrote in 1945.

This article validates for me the role that imagination can play in the innovation process. Dr. Bush was able to predict in part what the world would look and feel like many years into the future. Here’s what I learned from his approach to ideation and how we can use it to assist in our own quest for using imagination to come up with innovations of the future.

Before trying to predict the future, understand the present

First, Bush thoroughly analyzed the present-day (mid-1940s) situation, including what WWII had fostered and what it had hindered, from the perspective of scientific inquiry and progress. He shared his thoughts on the state of scientific research, noting where science had made progress and where it had stood still.

Identify potentialities by extrapolation

He then extrapolated into the future by identifying the potential in the progress already made, and shared what he thought would happen in the near term if things continued on the same trajectory. This is where he talked about immediate and imminent progress based on what was already happening. Most futurists and trend predictors use this process to forecast their trends.

Now, let your imagination fly

Once he had built a solid foundation by identifying the progress made and what was expected in the near term, he allowed his imagination to take flight. He talked about the camera becoming so small that someone could carry one strapped to their forehead (sounds to me like a GoPro):

The camera hound of the future wears on his forehead a lump a little larger than a walnut.

He then explored and explained what the film and printing process would look like:

Often it would be advantageous to be able to snap the camera and to look at the picture immediately.

He imagined advances in microfilm technology that would enable the whole of the Encyclopedia Britannica (one of the largest reference works of that time) to be available on something the size of a matchbox.

The Encyclopedia Britannica could be reduced to the volume of a matchbox. A library of a million volumes could be compressed into one end of a desk. If the human race has produced since the invention of movable type a total record, in the form of magazines, newspapers, books, tracts, advertising blurbs, correspondence, having a volume corresponding to a billion books, the whole affair, assembled and compressed, could be lugged off in a moving van.

The material for the microfilm Britannica would cost a nickel, and it could be mailed anywhere for a cent.

In addition to storing all of this knowledge in a small space, he said it was also important to create new knowledge, and to do so in an easy and simple way. He talked about a device into which someone speaks (in a specific way) and which converts the speech into the appropriate text (sounds a lot like voice-to-text systems – Siri?):

To make the record, we now push a pencil or tap a typewriter. Then comes the process of digestion and correction, followed by an intricate process of typesetting, printing, and distribution. To consider the first stage of the procedure, will the author of the future cease writing by hand or typewriter and talk directly to the record? He does so indirectly, by talking to a stenographer or a wax cylinder; but the elements are all present if he wishes to have his talk directly produce a typed record. All he needs to do is to take advantage of existing mechanisms and to alter his language.

He then let his imagination take full flight, putting all of this together to predict what it would feel like to live in an era with such devices:

One can now picture a future investigator in his laboratory. His hands are free, and he is not anchored. As he moves about and observes, he photographs and comments. Time is automatically recorded to tie the two records together. If he goes into the field, he may be connected by radio to his recorder. As he ponders over his notes in the evening, he again talks his comments into the record. His typed record, as well as his photographs, may both be in miniature, so that he projects them for examination.

He acknowledged that a lot would need to happen between 1945’s reality and his imagined one, but he was confident that it was all possible. He showed how past progress implied that the pace of innovation and creativity would only accelerate, meaning his imagined reality would arrive not long after the time he was writing the piece.

Next, he described mathematical inquiry and offered his definition of a mathematician:

A mathematician is not a man who can readily manipulate figures; often he cannot. He is not even a man who can readily perform the transformations of equations by the use of calculus. He is primarily an individual who is skilled in the use of symbolic logic on a high plane, and especially he is a man of intuitive judgment in the choice of the manipulative processes he employs.

This is probably the closest definition I have come across for a data scientist. Bush said machines would do the actual mathematical calculations, enabling the mathematician to think at a higher order of logic. He also understood that the potential of such a machine was not limited to the scientist.

The scientist, however, is not the only person who manipulates data and examines the world about him by the use of logical processes, although he sometimes preserves this appearance by adopting into the fold anyone who becomes logical, much in the manner in which a British labor leader is elevated to knighthood. Whenever logical processes of thought are employed – that is, whenever thought for a time runs along an accepted groove – there is an opportunity for the machine. Formal logic used to be a keen instrument in the hands of the teacher in his trying of students’ souls. It is readily possible to construct a machine which will manipulate premises in accordance with formal logic, simply by the clever use of relay circuits. Put a set of premises into such a device and turn the crank, and it will readily pass out conclusion after conclusion, all in accordance with logical law, and with no more slips than would be expected of a keyboard adding machine.

I think this sounds like a general-purpose computer or even a smartphone. He then goes on to imagine how a retail store could be run if all these innovations became reality. It sounds a lot like an ERP system running the entire store and its operations.
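Bush’s “turn the crank” image is close to what we now call forward chaining. As a minimal sketch (my illustration, not Bush’s design, with hypothetical rule names), a few lines of Python can play the role of his relay-circuit device: feed in premises, and conclusions pass out mechanically:

```python
# A toy "logic machine" in the spirit of Bush's relay-circuit device:
# given premises (facts plus if-then rules), crank out every conclusion
# that follows by formal logic.

def forward_chain(facts, rules):
    """Repeatedly apply rules until no new conclusions appear."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= derived and conclusion not in derived:
                derived.add(conclusion)  # a new conclusion drops out
                changed = True
    return derived

# Premises: two facts and two if-then rules (illustrative only).
facts = {"it_rains", "no_umbrella"}
rules = [
    (("it_rains", "no_umbrella"), "gets_wet"),
    (("gets_wet",), "catches_cold"),
]

print(sorted(forward_chain(facts, rules)))
# ['catches_cold', 'gets_wet', 'it_rains', 'no_umbrella']
```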

He also predicted that machines could be taught to learn and to operate not just by selection through indexing but by association, and that machines would be able to beat humans (the story of IBM’s Watson winning Jeopardy!?) – what is today called machine learning.

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.

He described a personal machine (he called it the “memex”) that stores all the information and data we need as individuals (including all the knowledge humans have accumulated over the centuries) and is available whenever a person wants it. Information would be accessible by associative indexing (sounds like hyperlinking to me), which would allow us to move across connected and relevant topics.

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English longbow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.

Sounds a lot like a combination of Google, Wikipedia, and Evernote to me.
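The associative trail also maps neatly onto a small data structure. Here is a minimal sketch (all names and items hypothetical) of memex-style trails as ordered links between items, with side trails branching off – essentially hyperlinks before hyperlinks:

```python
# A minimal model of Bush's memex: items tied together into named trails,
# with side trails branching from the main one (illustrative sketch only).

class Memex:
    def __init__(self):
        self.trails = {}  # trail name -> ordered list of items

    def tie(self, trail, item):
        """Append an item (document, photo, or personal comment) to a trail."""
        self.trails.setdefault(trail, []).append(item)

    def replay(self, trail):
        """Walk a trail again later -- it does not fade."""
        return " -> ".join(self.trails.get(trail, []))

m = Memex()
m.tie("bow-and-arrow", "encyclopedia: short Turkish bow")
m.tie("bow-and-arrow", "history: Crusades skirmishes")
m.tie("bow-and-arrow", "own note: elasticity of materials matters")
m.tie("elasticity", "textbook: elastic properties")  # a side trail

print(m.replay("bow-and-arrow"))
```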

He then goes on to talk about science as a tool that could create weapons, but also innovations that could not only enable humanity to keep track of its history but create a completely new future as well.

Applied imagination

In a single 1945 article, Vannevar Bush imagined so many innovations that we enjoy today, seven decades later. He imagined things similar to GoPro, selfie sticks, Google Glass, ERP systems, digitized Encyclopedia Britannica, search engines, note-taking in the cloud, voice-to-text and text-to-voice conversions, personal computers, mobile phones, and much more.

This shows that if we start from where we are today, apply our imagination, and take leaps of faith, we can imagine what the future will look like and then go after that future with all our current strengths.

This ability to imagine is critical for all of us who wish to be part of the generation of innovators who will define how our future shapes up.

How to develop this ability to imagine

In “The Real Neuroscience of Creativity,” Scott Barry Kaufman talks about three large-scale brain networks – the Executive Attention Network (activated when we need focused attention to do something specific), the Imagination Network (also called the Default Network), and the Salience Network (which acts as the “switching” network and decides which network needs to be activated when).

… the Default Network (referred to here as the Imagination Network) is involved in “constructing dynamic mental simulations based on personal past experiences such as used during remembering, thinking about the future, and generally when imagining alternative perspectives and scenarios to the present.” The Imagination Network is also involved in social cognition. For instance, when we are imagining what someone else is thinking, this brain network is active. The Imagination Network involves areas deep inside the prefrontal cortex and temporal lobe (medial regions), along with communication with various outer and inner regions of the parietal cortex.

Conclusion

What this tells me is that the ability to imagine is inherently human and we are all capable of letting our imagination soar, if we want to.

So, the inability to imagine new or alternate realities is largely self-induced – and sometimes induced by our systems (e.g., education, or the culture of our organizations). This also means that it is in our own hands to set this right and start imagining alternate realities. The more we practice, the better we will get at it.

The more important innovation and creation become for us, the more critical the skill of imagining alternate realities.

When Vannevar Bush wrote this piece, it was a time when technological breakthroughs were imminent.

We are again at the same crossroads, and technological breakthroughs are imminent. The question we now need to ask is:

Will we bring in the breakthroughs, or will we stand and wait for someone to do it for us?


More predictions: AI will make customers and employees happier – as long as it learns to respect our boundaries. Learn more about Empathy: The Killer App for Artificial Intelligence.


About Mukesh Gupta

Mukesh Gupta previously held the role of Executive Liaison for the SAP User Group in India. He worked as the bridge between the user group and SAP (Development, Consulting, Sales, and Product Management).

How To Design Your Company’s Digital Transformation

Sam Yen

The September issue of the Harvard Business Review features a cover story on design thinking’s coming of age. We have been applying design thinking within SAP for the past 10 years, and I’ve witnessed the growth of this human-centered approach to innovation first hand.

Design thinking is, as the HBR piece points out, “the best tool we have for … developing a responsive, flexible organizational culture.”

This means businesses are doing more to learn about their customers by interacting directly with them. We’re seeing this change in our work on d.forum — a community of design thinking champions and “disruptors” from across industries.

Meanwhile, technology is making it possible to know exponentially more about a customer. Businesses can now make increasingly accurate predictions about customers’ needs well into the future. The businesses best able to access and pull insights from this growing volume of data will win. That requires a fundamental change for our own industry; it necessitates a digital transformation.

So, how do we design this digital transformation?

It starts with the customer and an application of design thinking throughout an organization – blending business, technology and human values to generate innovation. Business is already incorporating design thinking, as the HBR cover story shows. We in technology need to do the same.

Design thinking plays an important role because it helps articulate what the end customer’s experience is going to be like. It helps focus all aspects of the business on understanding and articulating that future experience.

Once an organization is able to do that, the insights from that consumer experience need to be drawn down into the business, with the central question becoming: What does this future customer experience mean for us as an organization? What barriers do we need to remove? Do we need to organize ourselves differently? Does our process need to change – if it does, how? What kind of new technology do we need?

Then an organization must look carefully at roles within itself. What does this knowledge of the end customer’s future experience mean for an individual in human resources, for example, or finance? Those roles can then be viewed as end experiences unto themselves, with organizations applying design thinking to learn about the needs inherent to those roles. They can then change roles to better meet the end customer’s future needs. This end customer-centered approach is what drives change.

This also means design thinking is more important than ever for IT organizations.

We in the IT industry have been charged with being responsive to business, using technology to solve the problems business presents. Unfortunately, business sometimes views IT as the organization that keeps the lights on. To use the analogy of a store: business is responsible for the front office, where consumers directly interact with products and marketing and the focus is on growing the business, while IT is perceived as the back office, keeping servers running and the distribution system humming. The key is for business and IT to align and meet the needs of the front office together.

Remember what I said about the growing availability of consumer data? The business best able to access and learn from that data will win. Those of us in IT organizations have the technology to make that win possible, but the way we are seen and our very nature needs to change if we want to remain relevant to business and participate in crafting the winning strategy.

We need to become more front office and less back office, proving to business that we are innovation partners in technology.

This means, in order to communicate with businesses today, we need to take a design thinking approach. We in IT need to show we have an understanding of the end consumer’s needs and experience, and we must align that knowledge and understanding with technological solutions. When this works — when the front office and back office come together in this way — it can lead to solutions that a company could otherwise never have realized.

There are different qualities, of course, between front-office and back-office requirements. The back office is the foundation of a company and requires robustness, stability, and reliability. The front office, on the other hand, moves much more quickly; it is always changing, with new product offerings and marketing campaigns, so its technology must show agility, flexibility, and speed. The business needs both functions to survive. This is a challenge for IT organizations, but it is not an impossible shift for us to make.

Here’s the breakdown of our challenge.

1. We need to better understand the real needs of the business.

This means learning more about the experience and needs of the end customer and then translating that information into technological solutions.

2. We need to be involved in more of the strategic discussions of the business.

Use regular invitations to meetings with business as an opportunity to surface deeper learning about the end consumer, along with the technology solutions that business may otherwise not know to ask for or how to implement.

The IT industry overall may not have a track record of operating in this way, but if we are not involved in the strategic direction of companies and shedding light on the future path, we risk not being considered innovation partners for the business.

We must collaborate with business, understand the strategic direction and highlight the technical challenges and opportunities. When we do, IT will become a hybrid organization – able to maintain the back office while capitalizing on the front office’s growing technical needs. We will highlight solutions that business could otherwise have missed, ushering in a digital transformation.

Digital transformation goes beyond just technology; it requires a mindset. See What It Really Means To Be A Digital Organization.

This story originally appeared on SAP Business Trends.



About Sam Yen

Sam Yen is the Chief Design Officer for SAP and the Managing Director of SAP Labs Silicon Valley. He is focused on driving a renewed commitment to design and user experience at SAP. Under his leadership, SAP further strengthens its mission of listening to customers’ needs, leading to tangible results, including SAP Fiori, SAP Screen Personas, and SAP’s UX design services.

How Productive Could You Be With 45 Minutes More Per Day?

Michael Rander

Chances are that you are already feeling your fair share of organizational complexity when navigating your current company, but have you ever considered just how much time is spent across all companies on managing complexity? According to a recent study by the Economist Intelligence Unit (EIU), the global impact of complexity is mind-blowing – and not in a good way.

The study revealed that 38% of respondents spent 16%-25% of their time just dealing with organizational complexity, and 17% spent a staggering 26%-50% of their time doing so. To put that into more concrete numbers: in the US alone, if executives could cut the time they spend managing complexity in half, an estimated 8.6 million hours could be saved per week. That corresponds to 45 minutes per executive per day.
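As a quick back-of-envelope check (my arithmetic, not the EIU’s), the two figures line up if we assume a five-day workweek, which implies a population of roughly 2.3 million executives:

```python
# Sanity check: does 8.6 million hours/week match 45 minutes/executive/day?
# The 5-day workweek and the implied executive headcount are assumptions.

hours_saved_per_week = 8.6e6       # EIU estimate for the US
minutes_per_exec_per_day = 45
workdays_per_week = 5              # assumption

hours_per_exec_per_week = minutes_per_exec_per_day * workdays_per_week / 60
implied_executives = hours_saved_per_week / hours_per_exec_per_week

print(f"{hours_per_exec_per_week:.2f} hours/week per executive")  # 3.75
print(f"{implied_executives:,.0f} executives implied")            # ~2,293,333
```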

The potential productivity impact of every executive having 45 minutes more to work every single day is clearly significant. And considering that 55% of respondents say their organization is either very or extremely complex, why are we not making the reduction of complexity one of our top-of-mind issues?

The problem is that identifying the sources of complexity is complex in and of itself. Key sources include organizational size, executive priorities, pace of innovation, decision-making processes, vastly increasing amounts of data to manage, organizational structures, and the very culture of the company. As a consequence, answers are not universal by any means.

That being said, the negative productivity impact of complexity, regardless of its specific source, is felt similarly across a very large segment of respondents, with 55% stating that complexity has taken a direct toll on profitability over the past three years. The problem is serious enough that 8% of respondents actually slowed down their company’s growth in order to deal with complexity.

So, if complexity oftentimes impacts productivity and subsequently profitability, what are some of the more successful initiatives that companies are taking to combat these effects? Among the answers from the EIU survey, the following were highlighted among the most likely initiatives to reduce complexity and ultimately increase productivity:

  • Making it a company-wide goal to reduce complexity means that the executive level has to live and breathe simplification in order for the rest of the organization to get behind it. Changing behaviors across the organization requires strong leadership, commitment, and change management, and these initiatives ultimately lead to improved decision-making processes, which respondents reported as the top benefit of reducing complexity. From a leadership perspective, this also requires setting appropriate metrics for measuring outcomes; productivity and efficiency were by far the most popular metrics among respondents, though, strangely, collaboration-related metrics did not rank high despite collaboration being a high-level priority.
  • Promoting a culture of collaboration means enabling employees and management alike to collaborate not only within their teams but also across the organization, with partners, and with customers. Creating cross-functional roles to facilitate collaboration was cited by 56% as the most helpful strategy in achieving this goal.
  • More than half (54%) of respondents found the implementation of new technology and tools to be a successful step toward reducing complexity and improving productivity. Enabling collaboration, reducing information overload, building scenarios and prognoses, and supporting real-time decision-making are all key areas in which technology can help reduce complexity at every level of the organization.

While these initiatives won’t help everyone, it is interesting to see that more than half of companies believe that if they could cut complexity in half, they could be at least 11%-25% more productive. And nearly one in five respondents indicated that they could be 26%-50% more productive – a massive improvement.

The question then becomes whether we can make complexity and its impact on productivity not only more visible as a key issue for companies to address, but (even more importantly) also something that every company and every employee should be actively working to reduce. The potential productivity gains listed by respondents certainly provide food for thought, and few other corporate activities are likely to gain that level of ROI.

Just imagine having 45 minutes each and every day to actively pursue new projects, get innovative, collaborate, mentor, learn, reduce stress, etc. What would you do? The vision is certainly compelling, and the question is: are we, as companies, leaders, and employees, going to do something about it?

To read more about the EIU study, please see the full report.

Feel free to follow me on Twitter: @michaelrander


About Michael Rander

Michael Rander is the Global Research Director for Future Of Work at SAP. He is an experienced project manager, strategic and competitive market researcher, operations manager as well as an avid photographer, athlete, traveler and entrepreneur. Share your thoughts with Michael on Twitter @michaelrander.

Diving Deep Into Digital Experiences

Kai Goerlich

 

  • Google Cardboard VR goggles cost US$8
  • By 2019, immersive solutions will be adopted in 20% of enterprise businesses
  • By 2025, the market for immersive hardware and software technology could be $182 billion
  • In 2017, Lowe’s launched Holoroom How To VR DIY clinics

From Dipping a Toe to Fully Immersed

The first wave of virtual reality (VR) and augmented reality (AR) is here, using smartphones, glasses, and goggles to place us in the middle of 360-degree digital environments or overlay digital artifacts on the physical world. Prototypes, pilot projects, and first movers have already emerged:

  • Guiding warehouse pickers, cargo loaders, and truck drivers with AR
  • Overlaying constantly updated blueprints, measurements, and other construction data on building sites in real time with AR
  • Building 3D machine prototypes in VR for virtual testing and maintenance planning
  • Exhibiting new appliances and fixtures in a VR mockup of the customer’s home
  • Teaching medicine with AR tools that overlay diagnostics and instructions on patients’ bodies

A Vast Sea of Possibilities

Immersive technologies leapt forward in spring 2017 with the introduction of three new products:

  • Nvidia’s Project Holodeck, which generates shared photorealistic VR environments
  • A cloud-based platform for industrial AR from Lenovo New Vision AR and Wikitude
  • A workspace and headset from Meta that lets users interact with AR artifacts using their hands

The Truly Digital Workplace

New immersive experiences won’t simply be new tools for existing tasks. They promise to create entirely new ways of working.

VR avatars that look and sound like their owners will soon be able to meet in realistic virtual meeting spaces without requiring users to leave their desks or even their homes. With enough computing power and a smart-enough AI, we could soon let VR avatars act as our proxies while we’re doing other things—and (theoretically) do it well enough that no one can tell the difference.

We’ll need a way to signal when an avatar is being human driven in real time, when it’s on autopilot, and when it’s owned by a bot.


What Is Immersion?

A completely immersive experience that’s indistinguishable from real life is impossible given the current constraints on power, throughput, and battery life.

To make current digital experiences more convincing, we’ll need interactive sensors in objects and materials, more powerful infrastructure to create realistic images, and smarter interfaces to interpret and interact with data.

When everything around us is intelligent and interactive, every environment could have an AR overlay or VR presence, with use cases ranging from gaming to firefighting.

We could see a backlash touting the superiority of the unmediated physical world—but multisensory immersive experiences that we can navigate in 360-degree space will change what we consider “real.”


Download the executive brief Diving Deep Into Digital Experiences.


Read the full article Swimming in the Immersive Digital Experience.


About Kai Goerlich

Kai Goerlich is the Chief Futurist at SAP Innovation Center Network. His specialties include Competitive Intelligence, Market Intelligence, Corporate Foresight, Trends, Futuring, and ideation. Share your thoughts with Kai on Twitter @KaiGoe.


Jenny Dearborn: Soft Skills Will Be Essential for Future Careers

Jenny Dearborn

The Japanese culture has always shown a special reverence for its elderly. That’s why, in 1963, the government began a tradition of giving a silver dish, called a sakazuki, to each citizen who reached the age of 100 by Keiro no Hi (Respect for the Elders Day), which is celebrated on the third Monday of each September.

That first year, there were 153 recipients, according to The Japan Times. By 2016, the number had swelled to more than 65,000, and the dishes cost the already cash-strapped government more than US$2 million, Business Insider reports. Despite the country’s continued devotion to its seniors, the article continues, the government felt obliged to downgrade the finish of the dishes to silver plating to save money.

What tends to get lost in discussions about automation taking over jobs and Millennials taking over the workplace is the impact of increased longevity. In the future, people will need to be in the workforce much longer than they are today. Half of the people born in Japan today, for example, are predicted to live to 107, making their ancestors seem fragile, according to Lynda Gratton and Andrew Scott, professors at the London Business School and authors of The 100-Year Life: Living and Working in an Age of Longevity.

The End of the Three-Stage Career

Assuming that advances in healthcare continue, future generations in wealthier societies could be looking at careers lasting 65 or more years, rather than at the roughly 40 years for today’s 70-year-olds, write Gratton and Scott. The three-stage model of employment that dominates the global economy today—education, work, and retirement—will be blown out of the water.

It will be replaced by a new model in which people continually learn new skills and shed old ones. Consider that, in many industries and countries, today’s most in-demand occupations and specialties did not exist 10 years ago, according to The Future of Jobs, a report from the World Economic Forum.

And the pace of change is only going to accelerate. Sixty-five percent of children entering primary school today will ultimately end up working in jobs that don’t yet exist, the report notes.

Our current educational systems are not equipped to cope with this degree of change. For example, roughly half of the subject knowledge acquired during the first year of a four-year technical degree, such as computer science, is outdated by the time students graduate, the report continues.

Skills That Transcend the Job Market

Instead of treating post-secondary education as a jumping-off point for a specific career path, we may see a switch to a shorter school career that focuses more on skills that transcend a constantly shifting job market. Today, some of these skills, such as complex problem solving and critical thinking, are taught mostly in the context of broader disciplines, such as math or the humanities.

Other competencies that will become critically important in the future are currently treated as if they come naturally or over time with maturity or experience. We receive little, if any, formal training, for example, in creativity and innovation, empathy, emotional intelligence, cross-cultural awareness, persuasion, active listening, and acceptance of change. (No wonder the self-help marketplace continues to thrive!)


These skills, which today are heaped together under the dismissive “soft” rubric, are going to harden up to become indispensable. They will become more important, thanks to artificial intelligence and machine learning, which will usher in an era of infinite information, rendering the concept of an expert in most of today’s job disciplines a quaint relic. As our ability to know more than those around us decreases, our need to be able to collaborate well (with both humans and machines) will help define our success in the future.

Individuals and organizations alike will have to learn how to become more flexible and ready to give up set-in-stone ideas about how businesses and careers are supposed to operate. Given the rapid advances in knowledge and attendant skills that the future will bring, we must be willing to say, repeatedly, that whatever we’ve learned to that point doesn’t apply anymore.

Careers will become more like life itself: a series of unpredictable, fluid experiences rather than a tightly scripted narrative. We need to think about the way forward and be more willing to accept change at the individual and organizational levels.

Rethink Employee Training

One way that organizations can help employees manage this shift is by rethinking training. Today, overworked and overwhelmed employees devote just 1% of their workweek to learning, according to a study by consultancy Bersin by Deloitte. Meanwhile, top business leaders such as Bill Gates and Nike founder Phil Knight spend about five hours a week reading, thinking, and experimenting, according to an article in Inc. magazine.

If organizations are to avoid high turnover costs in a world where the need for new skills is shifting constantly, they must give employees more time for learning and make training courses more relevant to the future needs of organizations and individuals, not just to their current needs.

The amount of learning required will vary by role. That’s why at SAP we’re creating learning personas for specific roles in the company and determining how many hours will be required for each. We’re also dividing up training hours into distinct topics:

  • Law: 10%. This is training required by law, such as training to prevent sexual harassment in the workplace.

  • Company: 20%. Company training includes internal policies and systems.

  • Business: 30%. Employees learn skills required for their current roles in their business units.

  • Future: 40%. This is internal, external, and employee-driven training to close critical skill gaps for jobs of the future.
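To illustrate how such a persona budget might play out (the 100-hour annual total below is hypothetical; only the percentage split comes from the breakdown above), the split translates into concrete hours like this:

```python
# Split a persona's annual training budget across the four topic buckets.
# The percentage split is from the article; the 100-hour total is invented.

TOPIC_SPLIT = {"Law": 0.10, "Company": 0.20, "Business": 0.30, "Future": 0.40}

def training_plan(total_hours):
    """Return hours per topic for a given persona's annual training budget."""
    return {topic: total_hours * share for topic, share in TOPIC_SPLIT.items()}

for topic, hours in training_plan(100).items():
    print(f"{topic:8s} {hours:5.1f} h")
# Law 10.0, Company 20.0, Business 30.0, Future 40.0
```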

In the future, we will always need to learn, grow, read, seek out knowledge and truth, and better ourselves with new skills. With the support of employers and educators, we will transform our hardwired fear of change into excitement for change.

We must be able to say to ourselves, “I’m excited to learn something new that I never thought I could do or that never seemed possible before.”
