
The Role Of Imagination In Creating The Next Set Of Breakthrough Innovations

Mukesh Gupta

I stumbled across “As We May Think,” an article that Vannevar Bush, then Director of the Office of Scientific Research and Development, wrote in 1945.

This article validates for me the role that imagination can play in the innovation process. Dr. Bush was able to predict in part what the world would look and feel like many years into the future. Here’s what I learned from his approach to ideation and how we can use it to assist in our own quest for using imagination to come up with innovations of the future.

Before trying to predict the future, understand the present

First, Bush thoroughly analyzed the present-day (mid-1940s) situation, including what WWII had fostered and what it had hindered, from the perspective of scientific inquiry and progress. He shared his thoughts on the state of scientific research – where science had made progress and where it stood still.

Identify potentialities by extrapolation

He then extrapolated into the future by identifying the potentialities in the progress made, and shared what he thought would happen in the near term if things continued on the same trajectory. This is where he talked about immediate and imminent progress based on what was already happening. Most futurists and trend predictors use this process to forecast their trends.

Now, let your imagination fly

Once he had built a solid foundation by identifying the progress made and what was expected in the near term, he allowed his imagination to take flight. He talked about the camera becoming so small that someone could carry one strapped onto their forehead (sounds to me like a GoPro):

The camera hound of the future wears on his forehead a lump a little larger than a walnut.

He then explored and explained what the film and printing process would look like:

Often it would be advantageous to be able to snap the camera and to look at the picture immediately.

He imagined advances in microfilm technology that would enable the whole of the Encyclopedia Britannica (one of the largest book collections at that time) to be available on something the size of a matchbox.

The Encyclopedia Britannica could be reduced to the volume of a matchbox. A library of a million volumes could be compressed into one end of a desk. If the human race has produced since the invention of movable type a total record, in the form of magazines, newspapers, books, tracts, advertising blurbs, correspondence, having a volume corresponding to a billion books, the whole affair, assembled and compressed, could be lugged off in a moving van.

The material for the microfilm Britannica would cost a nickel, and it could be mailed anywhere for a cent.

In addition to storing all of this knowledge in a small space, he said it was also important to create new knowledge, and to do so in an easy and simple way. He talked about a device into which someone speaks (in a specific way) and which converts the speech into the appropriate text (sounds a lot like today's voice-to-text tools – Siri?):

To make the record, we now push a pencil or tap a typewriter. Then comes the process of digestion and correction, followed by an intricate process of typesetting, printing, and distribution. To consider the first stage of the procedure, will the author of the future cease writing by hand or typewriter and talk directly to the record? He does so indirectly, by talking to a stenographer or a wax cylinder; but the elements are all present if he wishes to have his talk directly produce a typed record. All he needs to do is to take advantage of existing mechanisms and to alter his language.
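That imagined dictation machine is now an off-the-shelf commodity. As a minimal sketch of how "talking directly to the record" works today, here is an example using the Python speech_recognition package; the input file dictation.wav and the choice of recognizer engine are my own illustrative assumptions, not anything from Bush's article.

```python
# A minimal sketch of Bush's "talk directly to the record" idea using the
# third-party `speech_recognition` package (pip install SpeechRecognition).
# The file name "dictation.wav" is a hypothetical example input.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.AudioFile("dictation.wav") as source:
    audio = recognizer.record(source)  # read the entire audio file

try:
    # Google's free web recognizer; other engines work similarly.
    text = recognizer.recognize_google(audio)
    print("Typed record:", text)
except sr.UnknownValueError:
    print("Could not understand the audio")
```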

He then took flight in his imagination to put all of this together and predict what it would feel like to live in an era with such devices:

One can now picture a future investigator in his laboratory. His hands are free, and he is not anchored. As he moves about and observes, he photographs and comments. Time is automatically recorded to tie the two records together. If he goes into the field, he may be connected by radio to his recorder. As he ponders over his notes in the evening, he again talks his comments into the record. His typed record, as well as his photographs, may both be in miniature, so that he projects them for examination.

He acknowledged that a lot would need to happen between 1945’s reality and his imagined reality, but he was confident that it was all possible. He showed how past progress implied that the pace of innovation and creativity would only accelerate, meaning his imagined reality would not be very far from the time he was writing the piece.

Next, he described mathematical inquiry and offered his definition of a mathematician:

A mathematician is not a man who can readily manipulate figures; often he cannot. He is not even a man who can readily perform the transformations of equations by the use of calculus. He is primarily an individual who is skilled in the use of symbolic logic on a high plane, and especially he is a man of intuitive judgment in the choice of the manipulative processes he employs.

This is probably the closest definition I have come across for a data scientist. Bush said machines would do the actual mathematical calculations and enable the mathematician to think about a higher order of logic. He also understood that the potential of such a machine was not limited to the scientist.

The scientist, however, is not the only person who manipulates data and examines the world about him by the use of logical processes, although he sometimes preserves this appearance by adopting into the fold anyone who becomes logical, much in the manner in which a British labor leader is elevated to knighthood. Whenever logical processes of thought are employed – that is, whenever thought for a time runs along an accepted groove – there is an opportunity for the machine. Formal logic used to be a keen instrument in the hands of the teacher in his trying of students’ souls. It is readily possible to construct a machine which will manipulate premises in accordance with formal logic, simply by the clever use of relay circuits. Put a set of premises into such a device and turn the crank, and it will readily pass out conclusion after conclusion, all in accordance with logical law, and with no more slips than would be expected of a keyboard adding machine.

I think this sounds like a general-purpose computer or even a smartphone. He then went on to imagine how a retail store could be run if all these innovations became a reality. It sounds a lot like an ERP system running the entire store and its operations.
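Bush's crank-driven logic machine maps almost directly onto a few lines of modern code. The toy forward-chaining engine below is my own illustration of the idea, not anything from Bush's article: given a set of premises and if-then rules, it keeps "turning the crank" until no new conclusions appear.

```python
# A toy version of Bush's premise-cranking logic machine: given known facts
# and if-then rules, keep applying them until no new conclusions appear.
def turn_the_crank(facts, rules):
    """facts: set of propositions; rules: list of (premises, conclusion)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)  # a new conclusion "passes out"
                changed = True
    return facts

# Hypothetical example premises:
rules = [
    ({"it_rains"}, "ground_wet"),
    ({"ground_wet", "freezing"}, "ground_icy"),
]
print(turn_the_crank({"it_rains", "freezing"}, rules))
# -> {'it_rains', 'freezing', 'ground_wet', 'ground_icy'}
```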

He also predicted that machines could be taught to learn and operate not just by selection through indexing but by association, and that they would outperform humans in certain respects (the story of IBM’s Watson winning Jeopardy?) – what we today call machine learning.

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.

He described a personal machine (he called it the “memex”) that stores all the information and data we need as individuals (including all the knowledge that humans have accumulated over the centuries) and makes it available whenever we want it. Information would be accessible by associative indexing (sounds like hyperlinking to me), which would allow us to move across connected and relevant topics.

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English longbow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.

Sounds a lot like a combination of Google, Wikipedia, and Evernote to me.
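A memex "trail," in modern terms, is a user-authored chain of links over documents. Here is a minimal sketch of that structure in Python, with item names drawn from Bush's own bow-and-arrow example; the Item class and its fields are my invention for illustration.

```python
# A minimal sketch of a memex-style associative trail: items joined by
# named links, with side trails and personal annotations.
from dataclasses import dataclass, field

@dataclass
class Item:
    title: str
    content: str = ""
    links: dict = field(default_factory=dict)  # label -> Item

    def tie(self, label, other):
        """Associatively link this item to another, as Bush described."""
        self.links[label] = other
        return other

# Hypothetical bow-and-arrow trail, following Bush's own example:
encyclopedia = Item("Encyclopedia: bows")
history = encyclopedia.tie("next", Item("History: the Crusades"))
history.tie("side trail", Item("Textbook: elasticity of materials"))
history.tie("my note", Item("Longhand analysis", "Material elasticity matters."))

# Walking the main trail is just following the "next" links:
node = encyclopedia
while node:
    print(node.title)
    node = node.links.get("next")
```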

He then went on to note that science is a tool that could create weapons, but also innovations that would not only enable humanity to keep track of its history but create a completely new future as well.

Applied imagination

In a single 1945 article, Vannevar Bush imagined so many innovations that we enjoy today, seven decades later. He imagined things similar to GoPro, selfie sticks, Google Glass, ERP systems, digitized Encyclopedia Britannica, search engines, note-taking in the cloud, voice-to-text and text-to-voice conversions, personal computers, mobile phones, and much more.

This shows that if we start from where we are today, apply our imagination, and take leaps of faith, we can imagine what the future will look like and then go after that future with all our current strengths.

This ability to imagine is critical for all of us who wish to be part of the generation of innovators who will define how our future shapes up.

How to develop this ability to imagine

In “The Real Neuroscience of Creativity,” Scott Barry Kaufman talks about three kinds of neural networks – the Executive Attention Network (activated when we need focused attention to do something specific), the Imagination Network (also called the default network), and the Salience Network (acts as the “switching” network and decides which neural network needs to be activated when).

… the Default Network (referred to here as the Imagination Network) is involved in “constructing dynamic mental simulations based on personal past experiences such as used during remembering, thinking about the future, and generally when imagining alternative perspectives and scenarios to the present.” The Imagination Network is also involved in social cognition. For instance, when we are imagining what someone else is thinking, this brain network is active. The Imagination Network involves areas deep inside the prefrontal cortex and temporal lobe (medial regions), along with communication with various outer and inner regions of the parietal cortex.

Conclusion

What this tells me is that the ability to imagine is inherently human, and we are all capable of letting our imagination soar if we want to.

So, the inability to imagine new or alternate realities is largely self-induced – sometimes induced by our systems (e.g., education, or the culture of our organizations). This also means that it is in our own hands to set this right and start imagining alternate realities. The more we practice, the better we will get at it.

The more important it is for us to innovate and create, the more critical the skill of imagining alternate realities becomes.

When Bush wrote this piece, it was a time when technological breakthroughs were imminent.

We are again at the same crossroads, and technological breakthroughs are imminent. The question we now need to ask is:

Will we bring in the breakthroughs, or will we stand and wait for someone to do it for us?


More predictions: AI will make customers and employees happier – as long as it learns to respect our boundaries. Learn more about Empathy: The Killer App for Artificial Intelligence.


About Mukesh Gupta

Mukesh Gupta previously held the role of Executive Liaison for the SAP User Group in India. He worked as the bridge between the user group and SAP (development, consulting, sales, and product management).

How To Design Your Company’s Digital Transformation

Sam Yen

The September issue of the Harvard Business Review features a cover story on design thinking’s coming of age. We have been applying design thinking within SAP for the past 10 years, and I’ve witnessed the growth of this human-centered approach to innovation first hand.

Design thinking is, as the HBR piece points out, “the best tool we have for … developing a responsive, flexible organizational culture.”

This means businesses are doing more to learn about their customers by interacting directly with them. We’re seeing this change in our work on d.forum — a community of design thinking champions and “disruptors” from across industries.

Meanwhile, technology is making it possible to know exponentially more about a customer. Businesses can now make increasingly accurate predictions about customers’ needs well into the future. The businesses best able to access and pull insights from this growing volume of data will win. That requires a fundamental change for our own industry; it necessitates a digital transformation.

So, how do we design this digital transformation?

It starts with the customer and an application of design thinking throughout an organization – blending business, technology and human values to generate innovation. Business is already incorporating design thinking, as the HBR cover story shows. We in technology need to do the same.


Design thinking plays an important role because it helps articulate what the end customer’s experience is going to be like. It helps focus all aspects of the business on understanding and articulating that future experience.

Once an organization is able to do that, the insights from that consumer experience need to be drawn down into the business, with the central question becoming: What does this future customer experience mean for us as an organization? What barriers do we need to remove? Do we need to organize ourselves differently? Does our process need to change – if it does, how? What kind of new technology do we need?

Then an organization must look carefully at roles within itself. What does this knowledge of the end customer’s future experience mean for an individual in human resources, for example, or finance? Those roles can then be viewed as end experiences unto themselves, with organizations applying design thinking to learn about the needs inherent to those roles. They can then change roles to better meet the end customer’s future needs. This end customer-centered approach is what drives change.

This also means design thinking is more important than ever for IT organizations.

We, in the IT industry, have been charged with being responsive to business, using technology to solve the problems business presents. Unfortunately, business sometimes views IT as merely the organization keeping the lights on. To use the analogy of a store: business runs the front office, where consumers directly interact with products and marketing, while IT is perceived as running the back office, keeping the servers running and the distribution system humming. The key is for business and IT to align and meet the needs of the front office together.

Remember what I said about the growing availability of consumer data? The business best able to access and learn from that data will win. Those of us in IT organizations have the technology to make that win possible, but the way we are seen and our very nature needs to change if we want to remain relevant to business and participate in crafting the winning strategy.

We need to become more front office and less back office, proving to business that we are innovation partners in technology.

This means, in order to communicate with businesses today, we need to take a design thinking approach. We in IT need to show we have an understanding of the end consumer’s needs and experience, and we must align that knowledge and understanding with technological solutions. When this works — when the front office and back office come together in this way — it can lead to solutions that a company could otherwise never have realized.

There are different qualities, of course, between front-office and back-office requirements. The back office is the foundation of a company and requires robustness, stability, and reliability. The front office, on the other hand, moves much more quickly; it is always changing with new product offerings and marketing campaigns, so its technology must show agility, flexibility, and speed. The business needs both functions to survive. This is a challenge for IT organizations, but it is not an impossible shift for us to make.

Here’s the breakdown of our challenge.

1. We need to better understand the real needs of the business.

This means learning more about the experience and needs of the end customer and then translating that information into technological solutions.

2. We need to be involved in more of the strategic discussions of the business.

Use the regular invitations to meetings with business as an opportunity to surface deeper learning about the end consumer, along with the technology solutions that business may not otherwise know to ask for or how to implement.

The IT industry overall may not have a track record of operating in this way, but if we are not involved in the strategic direction of companies and shedding light on the future path, we risk not being considered innovation partners for the business.

We must collaborate with business, understand the strategic direction and highlight the technical challenges and opportunities. When we do, IT will become a hybrid organization – able to maintain the back office while capitalizing on the front office’s growing technical needs. We will highlight solutions that business could otherwise have missed, ushering in a digital transformation.

Digital transformation goes beyond just technology; it requires a mindset. See What It Really Means To Be A Digital Organization.

This story originally appeared on SAP Business Trends.



About Sam Yen

Sam Yen is the Chief Design Officer for SAP and the Managing Director of SAP Labs Silicon Valley. He is focused on driving a renewed commitment to design and user experience at SAP. Under his leadership, SAP further strengthens its mission of listening to customers' needs and turning them into tangible results, including SAP Fiori, SAP Screen Personas, and SAP's UX design services.

How Productive Could You Be With 45 Minutes More Per Day?

Michael Rander

Chances are that you are already feeling your fair share of organizational complexity when navigating your current company, but have you ever considered just how much time is spent across all companies on managing complexity? According to a recent study by the Economist Intelligence Unit (EIU), the global impact of complexity is mind-blowing – and not in a good way.

The study revealed that 38% of respondents spent 16%-25% of their time just dealing with organizational complexity, and 17% spent a staggering 26%-50% of their time doing so. To put that into more concrete numbers: in the US alone, if executives could cut the time they spend managing complexity in half, an estimated 8.6 million hours could be saved every week. That corresponds to 45 minutes per executive per day.
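Those headline figures are easy to sanity-check. Assuming a five-day work week (my assumption; the article does not state one), the two numbers are mutually consistent for a population of roughly 2.3 million executives:

```python
# Sanity-checking the EIU figures: 45 min/day per executive at five
# workdays per week (assumed) versus 8.6 million hours saved per week.
minutes_per_day = 45
workdays_per_week = 5            # assumption; not stated in the article
hours_saved_per_week = 8.6e6

hours_per_exec_per_week = minutes_per_day * workdays_per_week / 60  # 3.75 h
implied_executives = hours_saved_per_week / hours_per_exec_per_week
print(f"Implied number of executives: {implied_executives:,.0f}")
# -> roughly 2,293,333, i.e., about 2.3 million US executives
```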

The potential productivity impact of every executive having 45 more minutes to work every single day is clearly significant. Considering that 55% of respondents say their organization is either very or extremely complex, why are we not making the reduction of complexity one of our top-of-mind issues?

The problem is that identifying the sources of complexity is complex in and of itself. Key sources include organizational size, executive priorities, pace of innovation, decision-making processes, vastly increasing amounts of data to manage, organizational structures, and the culture of the company itself. As a consequence, answers are not universal by any means.

That being said, the negative productivity impact of complexity, regardless of its specific source, is felt similarly across a very large segment of respondents, with 55% stating that complexity has taken a direct toll on profitability over the past three years. This is such a serious problem that 8% of respondents actually slowed down their company's growth in order to deal with complexity.

So, if complexity oftentimes impacts productivity and subsequently profitability, what are some of the more successful initiatives that companies are taking to combat these effects? Among the answers from the EIU survey, the following were highlighted among the most likely initiatives to reduce complexity and ultimately increase productivity:

  • Making it a company-wide goal to reduce complexity means that the executive level has to live and breathe simplification in order for the rest of the organization to get behind it. Changing behaviors across the organization requires strong leadership, commitment, and change management, and these initiatives ultimately lead to improved decision-making processes, which respondents reported as the top benefit of reducing complexity. From a leadership perspective, this also requires setting appropriate metrics for measuring outcomes; productivity and efficiency were by far the most popular metric choices among respondents, though, strangely, collaboration-related metrics did not rank high despite collaboration being a high-level priority.
  • Promoting a culture of collaboration means enabling employees and management alike to collaborate not only within their teams but also across the organization, with partners, and with customers. Creating cross-functional roles to facilitate collaboration was cited by 56% as the most helpful strategy in achieving this goal.
  • More than half (54%) of respondents found the implementation of new technology and tools to be a successful step toward reducing complexity and improving productivity. Enabling collaboration, reducing information overload, building scenarios and prognoses, and enabling real-time decision-making are all areas where technology can help reduce complexity at every level of the organization.

While these initiatives won’t help everyone, it is interesting to see that more than half of companies believe that if they could cut complexity in half, they could be at least 11%-25% more productive. And nearly one in five respondents indicated that they could be 26%-50% more productive – a massive potential improvement.

The question then becomes whether we can make complexity and its impact on productivity not only more visible as a key issue for companies to address, but (even more importantly) also something that every company and every employee should be actively working to reduce. The potential productivity gains listed by respondents certainly provide food for thought, and few other corporate activities are likely to gain that level of ROI.

Just imagine having 45 minutes each and every day for actively pursuing new projects, getting innovative, collaborating, mentoring, learning, reducing stress, and so on. What would you do? The vision is certainly compelling, and the question is: are we, as companies, leaders, and employees, going to do something about it?

To read more about the EIU study, please see:

Feel free to follow me on Twitter: @michaelrander


About Michael Rander

Michael Rander is the Global Research Director for Future of Work at SAP. He is an experienced project manager, strategic and competitive market researcher, and operations manager, as well as an avid photographer, athlete, traveler, and entrepreneur. Share your thoughts with Michael on Twitter @michaelrander.

How Emotionally Aware Computing Can Bring Happiness to Your Organization

Christopher Koch


Do you feel me?

Just as once-novel voice recognition technology is now a ubiquitous part of human–machine relationships, so too could mood recognition technology (aka “affective computing”) soon pervade digital interactions.

Through the application of machine learning, Big Data inputs, image recognition, sensors, and in some cases robotics, artificially intelligent systems hunt for affective clues: widened eyes, quickened speech, and crossed arms, as well as heart rate or skin changes.

Emotions are big business

The global affective computing market is estimated to grow from just over US$9.3 billion a year in 2015 to more than $42.5 billion by 2020.

Source: “Affective Computing Market 2015 – Technology, Software, Hardware, Vertical, & Regional Forecasts to 2020 for the $42 Billion Industry” (Research and Markets, 2015)
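For context, that forecast implies a compound annual growth rate of roughly 35% over the five years. The endpoint figures come from the cited forecast; the CAGR calculation below is my own derivation:

```python
# Implied compound annual growth rate of the affective computing market,
# from $9.3B (2015) to $42.5B (2020) per the cited forecast.
start, end, years = 9.3, 42.5, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # -> roughly 35.5%
```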

Customer experience is the sweet spot

Forrester found that emotion was the number-one factor in determining customer loyalty in 17 out of the 18 industries it surveyed – far more important than the ease or effectiveness of customers’ interactions with a company.


Source: “You Can’t Afford to Overlook Your Customers’ Emotional Experience” (Forrester, 2015)


Humana gets an emotional clue

Source: “Artificial Intelligence Helps Humana Avoid Call Center Meltdowns” (The Wall Street Journal, October 27, 2016)

Insurer Humana uses artificial intelligence software that can detect conversational cues to guide call-center workers through difficult customer calls. The system recognizes that a steady rise in the pitch of a customer’s voice or instances of agent and customer talking over one another are causes for concern.

The system has led to hard results: Humana says it has seen a 28% improvement in customer satisfaction, a 63% improvement in agent engagement, and a 6% improvement in first-contact resolution.


Spread happiness across the organization

Source: “Happiness and Productivity” (University of Warwick, February 10, 2014)

Employers could monitor employee moods to make organizational adjustments that increase productivity, effectiveness, and satisfaction. Happy employees are around 12% more productive.

Walking on emotional eggshells

Whether customers and employees will be comfortable having their emotions logged and broadcast by companies is an open question. Customers may find some uses of affective computing creepy or, worse, predatory. Be sure to get their permission.


Other limiting factors

The availability of the data required to infer a person’s emotional state is still limited. Further, it can be difficult to capture all the physical cues that may be relevant to an interaction, such as facial expression, tone of voice, or posture.

Get a head start


Discover the data

Companies should determine what inferences about mental states they want the system to make and how accurately those inferences can be made using the inputs available.


Work with IT

Involve IT and engineering groups to figure out the challenges of integrating with existing systems for collecting, assimilating, and analyzing large volumes of emotional data.


Consider the complexity

Some emotions may be more difficult to discern or respond to. Context is also key. An emotionally aware machine would need to respond differently to frustration in a user in an educational setting than to frustration in a user in a vehicle.

To learn more about how affective computing can help your organization, read the feature story Empathy: The Killer App for Artificial Intelligence.



About Christopher Koch

Christopher Koch is the Editorial Director of the SAP Center for Business Insight. He is an experienced publishing professional, researcher, editor, and writer in business, technology, and B2B marketing. Share your thoughts with Chris on Twitter @Ckochster.


In An Agile Environment, Revenue Models Are Flexible Too

Todd Wasserman

In 2012, Dollar Shave Club burst on the scene with a cheeky viral video that won praise for its creativity and marketing acumen. Less heralded at the time was the startup’s pricing model, which swapped traditional retail for subscriptions.

For as low as $1 a month (for five two-bladed cartridges), consumers got a package in the mail that saved them a trip to the pharmacy or grocery store. Dollar Shave Club received the ultimate vindication for the idea in 2016 when Unilever purchased the company for $1 billion.

As that example shows, new technology creates the possibility for new pricing models that can disrupt existing industries. The same phenomenon has occurred in software, where the cloud and Web-based interfaces have ushered in Software as a Service (SaaS), which charges users on a monthly basis, like a utility, instead of the typical purchase-and-later-upgrade model.

Pricing, in other words, is a variable that can be used to disrupt industries. Other options include usage-based pricing and freemium.

Products as services, services as products

There are basically two ways that businesses can use pricing to disrupt the status quo: Turn products into services and turn services into products. Dollar Shave Club and SaaS are two examples of turning products into services.

Others include Amazon’s Dash, a bare-bones Internet of Things device that lets consumers reorder items ranging from Campbell’s Soup to Play-Doh. Another example is Rent the Runway, which rents high-end fashion items for a weekend rather than selling the items. Trunk Club offers a twist on this by sending items picked out by a stylist to users every month. Users pay for what they want and send back the rest.

The other option is productizing a service. Restaurant franchising is based on this model. While the restaurant offers food service to consumers, for entrepreneurs the franchise offers guidance and brand equity condensed into a product format. For instance, Littler, a global employment law firm, has productized its offerings with Littler CaseSmart-Charges, which is designed for in-house attorneys and features software, project management tools, and access to flextime attorneys.

As that example shows, technology offers opportunities to try new revenue models. Another example is APIs, which have become a significant source of revenue in their own right. API monetization is often run as a side business with a wholly different pricing model, frequently engineered to build huge user bases through volume discounts.
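Volume-discounted API pricing is typically implemented as graduated tiers, where each additional block of calls costs less than the last. The sketch below is a generic illustration; the tier boundaries and per-call prices are made up for the example:

```python
# A minimal sketch of graduated (volume-discount) API pricing.
# Tier sizes and prices below are hypothetical, for illustration only.
TIERS = [                    # (calls covered by this tier, price per call)
    (100_000, 0.0100),       # first 100k calls at 1.0 cent each
    (900_000, 0.0050),       # next 900k calls at 0.5 cents each
    (float("inf"), 0.0010),  # everything beyond at 0.1 cents each
]

def monthly_bill(calls: int) -> float:
    total, remaining = 0.0, calls
    for tier_size, unit_price in TIERS:
        used = min(remaining, tier_size)
        total += used * unit_price
        remaining -= used
        if remaining <= 0:
            break
    return total

print(monthly_bill(2_500_000))  # -> 1000 + 4500 + 1500 = 7000.0
```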

Not a new idea

Though technology has opened up new vistas for businesses seeking alternate pricing models, Rajkumar Venkatesan, a marketing professor at the University of Virginia’s Darden School of Business, points out that this isn’t necessarily a new idea. For instance, King Gillette made his fortune in the early part of the 20th century by realizing that a cheap shaving device would pave the way for a recurring revenue stream via replacement razor blades.

“The new variation was the Keurig,” said Venkatesan, referring to the coffee machine that relies on replaceable cartridges. “It has started becoming more prevalent in the last 10 years, but the fundamental model has been there.” For businesses, this can be an attractive model not only for the recurring revenue but also for the ability to cross-sell new goods to existing customers, Venkatesan said.

Another benefit of a subscription model is that it can supply first-party data that companies can use to better understand and market to their customers. Some believe that Dollar Shave Club’s close relationship with its young male user base was one reason for Unilever’s purchase, for instance. In such a cut-throat market, such relationships can fetch a high price.

To learn more about how you can monetize disruption, watch this video overview of the new SAP Hybris Revenue Cloud.
