
The Role Of Imagination In Creating The Next Set Of Breakthrough Innovations

Mukesh Gupta

I stumbled across “As We May Think,” an article that Vannevar Bush, then Director of the Office of Scientific Research and Development, wrote in 1945.

This article validates for me the role that imagination can play in the innovation process. Dr. Bush was able to predict, in part, what the world would look and feel like many years into the future. Here’s what I learned from his approach to ideation, and how we can use it in our own quest to imagine the innovations of the future.

Before trying to predict the future, understand the present

First, Bush thoroughly analyzed the present-day (mid-1940s) situation, including what WWII had fostered and what it had hindered, from the perspective of scientific inquiry and progress. He shared his thoughts on the state of scientific research: where science had made progress and where it had stood still.

Identify potentialities by extrapolation

He then extrapolated into the future by identifying the potentialities in the progress already made, and shared what he thought would happen in the near term if things continued on the same trajectory. This is where he talked about immediate and imminent progress based on what was already happening. Most futurists and trend predictors use this process to forecast trends.

Now, let your imagination fly

Once he had built a good, solid foundation by identifying the progress made and what was expected in the near term, he allowed his imagination to take flight. He talked about the camera becoming so small that someone could carry one strapped to their forehead (sounds to me like a GoPro):

The camera hound of the future wears on his forehead a lump a little larger than a walnut.

He then explored and explained what the film and printing process would look like:

Often it would be advantageous to be able to snap the camera and to look at the picture immediately.

He imagined advances in microfilm technology that would enable the whole of the Encyclopedia Britannica (one of the largest book collections at that time) to be available on something the size of a matchbox.

The Encyclopedia Britannica could be reduced to the volume of a matchbox. A library of a million volumes could be compressed into one end of a desk. If the human race has produced since the invention of movable type a total record, in the form of magazines, newspapers, books, tracts, advertising blurbs, correspondence, having a volume corresponding to a billion books, the whole affair, assembled and compressed, could be lugged off in a moving van.

The material for the microfilm Britannica would cost a nickel, and it could be mailed anywhere for a cent.

In addition to storing all of this knowledge at a small size, he said it was also important to create new knowledge, and to do so in an easy and simple way. He talked about a device into which someone speaks (in a specific way) and which converts the speech into the appropriate text (sounds a lot like voice-to-text devices – Siri?):

To make the record, we now push a pencil or tap a typewriter. Then comes the process of digestion and correction, followed by an intricate process of typesetting, printing, and distribution. To consider the first stage of the procedure, will the author of the future cease writing by hand or typewriter and talk directly to the record? He does so indirectly, by talking to a stenographer or a wax cylinder; but the elements are all present if he wishes to have his talk directly produce a typed record. All he needs to do is to take advantage of existing mechanisms and to alter his language.

He then took flight in his imagination to put all of this together and predict what it would feel like to live in an era with such devices:

One can now picture a future investigator in his laboratory. His hands are free, and he is not anchored. As he moves about and observes, he photographs and comments. Time is automatically recorded to tie the two records together. If he goes into the field, he may be connected by radio to his recorder. As he ponders over his notes in the evening, he again talks his comments into the record. His typed record, as well as his photographs, may both be in miniature, so that he projects them for examination.

He acknowledged that a lot would need to happen between 1945’s reality and his imagined reality, but he was confident that it was all possible. He showed how past progress implied that the pace of innovation and creativity would only accelerate, meaning his imagined reality would not be very far from the time he was writing the piece.

Next, he described mathematical inquiry and his definition of a mathematician:

A mathematician is not a man who can readily manipulate figures; often he cannot. He is not even a man who can readily perform the transformations of equations by the use of calculus. He is primarily an individual who is skilled in the use of symbolic logic on a high plane, and especially he is a man of intuitive judgment in the choice of the manipulative processes he employs.

This is probably the closest definition that I have come across for a data scientist. Bush said machines would do the actual mathematical calculations and enable the mathematician to think at a higher order of logic. He also understood that the potential of such a machine was not limited to the scientist.

The scientist, however, is not the only person who manipulates data and examines the world about him by the use of logical processes, although he sometimes preserves this appearance by adopting into the fold anyone who becomes logical, much in the manner in which a British labor leader is elevated to knighthood. Whenever logical processes of thought are employed – that is, whenever thought for a time runs along an accepted groove – there is an opportunity for the machine. Formal logic used to be a keen instrument in the hands of the teacher in his trying of students’ souls. It is readily possible to construct a machine which will manipulate premises in accordance with formal logic, simply by the clever use of relay circuits. Put a set of premises into such a device and turn the crank, and it will readily pass out conclusion after conclusion, all in accordance with logical law, and with no more slips than would be expected of a keyboard adding machine.

I think this sounds like a general-purpose computer or even a smartphone. He then went on to imagine how a retail store could be run if all these innovations became a reality – it sounds a lot like an ERP system running the entire store and its operations.
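Bush’s relay-circuit logic machine is, at its core, what we would now call a forward-chaining inference engine. A minimal sketch in Python, with invented premises and rules, shows how “turning the crank” passes out conclusion after conclusion:

```python
# A minimal forward-chaining "logic machine": given premises (facts) and
# rules of the form (antecedents -> conclusion), it turns the crank and
# derives every conclusion that follows. Facts and rules are illustrative.
def turn_the_crank(facts, rules):
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, conclusion in rules:
            if set(antecedents) <= derived and conclusion not in derived:
                derived.add(conclusion)  # a new conclusion "passes out"
                changed = True
    return derived

facts = {"it_rains"}
rules = [
    (("it_rains",), "ground_is_wet"),
    (("ground_is_wet",), "shoes_get_muddy"),
]
print(turn_the_crank(facts, rules))
# {'it_rains', 'ground_is_wet', 'shoes_get_muddy'}
```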

He also predicted that machines could be taught to learn and operate not just by selection through indexing but by association, and that in some respects they would be able to beat the human mind (the story of IBM’s Watson winning Jeopardy?) – what today is called machine learning.

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.

He described a personal machine (he called it “memex”) that stores all the information and data that we need as individuals (including all the knowledge that humans have accumulated over the centuries) and is available whenever a person wants it. Information would be accessible by associative indexing (sounds like hyperlinking to me), which would allow us to move across connected and relevant topics.

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English longbow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.

Sounds a lot like a combination of Google, Wikipedia, and Evernote to me.
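Structurally, a memex trail is a user-built linked list of documents, with side trails branching off individual items – which is exactly why it reads like hyperlinking. A toy sketch in Python, with invented item names:

```python
# A toy memex: items tied together into a main trail, with side trails
# branching off any item. Item names are invented for illustration.
class Item:
    def __init__(self, title):
        self.title = title
        self.next = None          # next item on the main trail
        self.side_trails = []     # branches to related items

def tie(a, b):
    a.next = b                    # tie two items together into a trail
    return b

encyclopedia = Item("Encyclopedia article on the bow")
history = Item("History of the Crusades")
elasticity = Item("Textbook on elasticity")

tie(encyclopedia, history)
history.side_trails.append(elasticity)   # branch off on a side trail

# Walking the trail years later, exactly as Bush describes:
item = encyclopedia
while item:
    print(item.title, [s.title for s in item.side_trails])
    item = item.next
```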

He then went on to note that science is a tool that can create weapons as well as innovations – ones that could not only enable humanity to keep track of its history but also create a completely new future.

Applied imagination

In a single 1945 article, Vannevar Bush imagined so many innovations that we enjoy today, seven decades later. He imagined things similar to GoPro, selfie sticks, Google Glass, ERP systems, digitized Encyclopedia Britannica, search engines, note-taking in the cloud, voice-to-text and text-to-voice conversions, personal computers, mobile phones, and much more.

This shows that if we start from where we are today, apply our imagination, and take leaps of faith, we can imagine what the future will look like and then go after this future with all our current strengths.

This ability to imagine is critical for all of us who wish to be part of the generation of innovators who will define how our future shapes up.

How to develop this ability to imagine

In “The Real Neuroscience of Creativity,” Scott Barry Kaufman talks about three kinds of neural networks – the Executive Attention Network (activated when we need focused attention to do something specific), the Imagination Network (also called the default network), and the Salience Network (acts as the “switching” network and decides which neural network needs to be activated when).

… the Default Network (referred to here as the Imagination Network) is involved in “constructing dynamic mental simulations based on personal past experiences such as used during remembering, thinking about the future, and generally when imagining alternative perspectives and scenarios to the present.” The Imagination Network is also involved in social cognition. For instance, when we are imagining what someone else is thinking, this brain network is active. The Imagination Network involves areas deep inside the prefrontal cortex and temporal lobe (medial regions), along with communication with various outer and inner regions of the parietal cortex.

Conclusion

What this tells me is that the ability to imagine is inherently human and we are all capable of letting our imagination soar, if we want to.

So, the inability to imagine new or alternate realities is totally self-induced – and sometimes induced by our systems (e.g., education and even the culture of our organizations). This also means that it is in our very hands to set this right and start imagining alternate realities. The more we practice, the better we will get at it.

The more important it is for us to innovate and create, the more critical the skill of imagining alternate realities becomes.

When Vannevar Bush wrote this piece, it was a time when technological breakthroughs were imminent.

We are again at the same crossroads, and technological breakthroughs are imminent. The question we now need to ask is:

Will we bring in the breakthroughs, or will we stand and wait for someone to do it for us?


More predictions: AI will make customers and employees happier – as long as it learns to respect our boundaries. Learn more about Empathy: The Killer App for Artificial Intelligence.


About Mukesh Gupta

Mukesh Gupta previously held the role of Executive Liaison for the SAP User group in India. He worked as the bridge between the User group and SAP (Development, Consulting, Sales and product management).

How To Design Your Company’s Digital Transformation

Sam Yen

The September issue of the Harvard Business Review features a cover story on design thinking’s coming of age. We have been applying design thinking within SAP for the past 10 years, and I’ve witnessed the growth of this human-centered approach to innovation first hand.

Design thinking is, as the HBR piece points out, “the best tool we have for … developing a responsive, flexible organizational culture.”

This means businesses are doing more to learn about their customers by interacting directly with them. We’re seeing this change in our work on d.forum — a community of design thinking champions and “disruptors” from across industries.

Meanwhile, technology is making it possible to know exponentially more about a customer. Businesses can now make increasingly accurate predictions about customers’ needs well into the future. The businesses best able to access and pull insights from this growing volume of data will win. That requires a fundamental change for our own industry; it necessitates a digital transformation.

So, how do we design this digital transformation?

It starts with the customer and an application of design thinking throughout an organization – blending business, technology and human values to generate innovation. Business is already incorporating design thinking, as the HBR cover story shows. We in technology need to do the same.

Design thinking plays an important role because it helps articulate what the end customer’s experience is going to be like. It helps focus all aspects of the business on understanding and articulating that future experience.

Once an organization is able to do that, the insights from that consumer experience need to be drawn down into the business, with the central question becoming: What does this future customer experience mean for us as an organization? What barriers do we need to remove? Do we need to organize ourselves differently? Does our process need to change – if it does, how? What kind of new technology do we need?

Then an organization must look carefully at roles within itself. What does this knowledge of the end customer’s future experience mean for an individual in human resources, for example, or finance? Those roles can then be viewed as end experiences unto themselves, with organizations applying design thinking to learn about the needs inherent to those roles. They can then change roles to better meet the end customer’s future needs. This end customer-centered approach is what drives change.

This also means design thinking is more important than ever for IT organizations.

We, in the IT industry, have been charged with being responsive to business, using technology to solve the problems business presents. Unfortunately, business sometimes views IT as the organization keeping the lights on. If we make the analogy of a store: business is responsible for the front office, focused on growing the business where consumers directly interact with products and marketing; while the perception is that IT focuses on the back office, keeping servers running and the distribution system humming. The key is to have business and IT align to meet the needs of the front office together.

Remember what I said about the growing availability of consumer data? The business best able to access and learn from that data will win. Those of us in IT organizations have the technology to make that win possible, but the way we are seen and our very nature needs to change if we want to remain relevant to business and participate in crafting the winning strategy.

We need to become more front office and less back office, proving to business that we are innovation partners in technology.

This means, in order to communicate with businesses today, we need to take a design thinking approach. We in IT need to show we have an understanding of the end consumer’s needs and experience, and we must align that knowledge and understanding with technological solutions. When this works — when the front office and back office come together in this way — it can lead to solutions that a company could otherwise never have realized.

There are, of course, differences between front-office and back-office requirements. The back office is the foundation of a company and requires robustness, stability, and reliability. The front office, on the other hand, moves much more quickly: it is always changing with new product offerings and marketing campaigns, so its technology must show agility, flexibility, and speed. The business needs both functions to survive. This is a challenge for IT organizations, but it is not an impossible shift for us to make.

Here’s the breakdown of our challenge.

1. We need to better understand the real needs of the business.

This means learning more about the experience and needs of the end customer and then translating that information into technological solutions.

2. We need to be involved in more of the strategic discussions of the business.

We should use the regular invitations to meetings with business as an opportunity to surface our deeper learning about the end consumer, and to propose technology solutions that the business might not otherwise know to ask for or how to implement.

The IT industry overall may not have a track record of operating in this way, but if we are not involved in the strategic direction of companies and shedding light on the future path, we risk not being considered innovation partners for the business.

We must collaborate with business, understand the strategic direction and highlight the technical challenges and opportunities. When we do, IT will become a hybrid organization – able to maintain the back office while capitalizing on the front office’s growing technical needs. We will highlight solutions that business could otherwise have missed, ushering in a digital transformation.

Digital transformation goes beyond just technology; it requires a mindset. See What It Really Means To Be A Digital Organization.

This story originally appeared on SAP Business Trends.



About Sam Yen

Sam Yen is the Chief Design Officer for SAP and the Managing Director of SAP Labs Silicon Valley. He is focused on driving a renewed commitment to design and user experience at SAP. Under his leadership, SAP further strengthens its mission of listening to customers’ needs, leading to tangible results, including SAP Fiori, SAP Screen Personas, and SAP’s UX design services.

How Productive Could You Be With 45 Minutes More Per Day?

Michael Rander

Chances are that you are already feeling your fair share of organizational complexity when navigating your current company, but have you ever considered just how much time is spent across all companies on managing complexity? According to a recent study by the Economist Intelligence Unit (EIU), the global impact of complexity is mind-blowing – and not in a good way.

The study revealed that 38% of respondents spent 16%-25% of their time just dealing with organizational complexity, and 17% spent a staggering 26%-50% of their time doing so. To put that into more concrete numbers: in the US alone, if executives could cut the time they spend managing complexity in half, an estimated 8.6 million hours a week could be saved. That corresponds to 45 minutes per executive per day.
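The arithmetic behind those numbers is worth making explicit. Here is a quick back-of-the-envelope check in Python; the five-day workweek and the derived executive headcount are my assumptions, not figures from the study:

```python
# Back-of-the-envelope check of the EIU figures. Assumes a 5-day workweek;
# the implied executive headcount is derived, not taken from the study.
hours_saved_per_week = 8_600_000          # study's US-wide estimate
minutes_saved_per_exec_per_day = 45
workdays_per_week = 5                     # assumption

hours_per_exec_per_week = minutes_saved_per_exec_per_day * workdays_per_week / 60
implied_executives = hours_saved_per_week / hours_per_exec_per_week
print(f"{hours_per_exec_per_week:.2f} h/week per executive")   # 3.75
print(f"~{implied_executives:,.0f} executives implied")        # ~2,293,333
```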

The potential productivity impact of every executive having 45 more minutes to work every single day is clearly significant. And considering that 55% say their organization is either very or extremely complex, why are we not making the reduction of complexity one of our top-of-mind issues?

The problem is that identifying the sources of complexity is complex in and of itself. Key sources include organizational size, executive priorities, pace of innovation, decision-making processes, vastly increasing amounts of data to manage, organizational structures, and the culture of the company. As a consequence, the answers are not universal by any means.

That being said, the negative productivity impact of complexity, regardless of its specific source, is felt similarly across a very large segment of the respondents, with 55% stating that complexity has taken a direct toll on profitability over the past three years. This is such a serious problem that 8% of respondents actually slowed down their company’s growth in order to deal with complexity.

So, if complexity oftentimes impacts productivity and subsequently profitability, what are some of the more successful initiatives that companies are taking to combat these effects? Among the answers from the EIU survey, the following were highlighted among the most likely initiatives to reduce complexity and ultimately increase productivity:

  • Making it a company-wide goal to reduce complexity means that the executive level has to live and breathe simplification in order for the rest of the organization to get behind it. Changing behaviors across the organization requires strong leadership, commitment, and change management, and these initiatives ultimately lead to improved decision-making processes, which respondents reported as the top benefit of reducing complexity. From a leadership perspective, this also requires setting appropriate metrics for measuring outcomes; productivity and efficiency were by far the most popular metrics amongst respondents, though, strangely, collaboration-related metrics did not rank high despite collaboration being a high-level priority.
  • Promoting a culture of collaboration means enabling employees and management alike to collaborate not only within their teams but also across the organization, with partners, and with customers. Creating cross-functional roles to facilitate collaboration was cited by 56% as the most helpful strategy in achieving this goal.
  • More than half (54%) of respondents found the implementation of new technology and tools to be a successful step towards reducing complexity and improving productivity. Enabling collaboration, reducing information overload, building scenarios and prognoses, and enabling real-time decision-making are all key areas where technology can help reduce complexity at all levels of the organization.

While these initiatives won’t help everyone, it is interesting to see that more than half of companies believe that if they could cut complexity in half, they could be at least 11%-25% more productive. Nearly one in five respondents even indicated that they could be 26%-50% more productive – a massive improvement.

The question then becomes whether we can make complexity and its impact on productivity not only more visible as a key issue for companies to address, but (even more importantly) also something that every company and every employee should be actively working to reduce. The potential productivity gains listed by respondents certainly provide food for thought, and few other corporate activities are likely to gain that level of ROI.

Just imagine having 45 minutes each and every day for actively pursuing new projects, getting innovative, collaborating, mentoring, learning, reducing stress, etc. What would you do? The vision is certainly compelling, and the question is: are we, as companies, leaders, and employees, going to do something about it?


Feel free to follow me on Twitter: @michaelrander


About Michael Rander

Michael Rander is the Global Research Director for Future Of Work at SAP. He is an experienced project manager, strategic and competitive market researcher, operations manager as well as an avid photographer, athlete, traveler and entrepreneur. Share your thoughts with Michael on Twitter @michaelrander.

The Future of Cybersecurity: Trust as Competitive Advantage

Justin Somaini and Dan Wellers

 

  • The cost of data breaches will reach US$2.1 trillion globally by 2019 – nearly four times the cost in 2015.
  • Cyberattacks could cost the world up to $90 trillion in lost net economic benefits by 2030 if cybersecurity doesn’t keep pace with growing threat levels.
  • Cyber insurance premiums could increase tenfold to $20 billion annually by 2025.
  • Cyberattacks are one of the top 10 global risks of highest concern for the next decade.


Companies are collaborating with a wider network of partners, embracing distributed systems, and meeting new demands for 24/7 operations.

But the bad guys are sharing intelligence, harnessing emerging technologies, and working round the clock as well—and companies are giving them plenty of weaknesses to exploit.

  • Only 33% of companies today are prepared to prevent a worst-case attack.
  • Only 25% treat cyber risk as a significant corporate risk.
  • 80% fail to assess their customers and suppliers for cyber risk.

The ROI of Zero Trust

Perimeter security will not be enough. As interconnectivity increases, so will the adoption of zero-trust networks, which place controls around data assets and increase visibility into how those assets are used across the digital ecosystem.
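To make the idea concrete: in a zero-trust network, no request is trusted because of where it comes from; every access to a data asset is authenticated, checked against policy, and logged for visibility. A minimal sketch in Python, assuming an invented policy table and identities:

```python
# A minimal zero-trust access check: every request is evaluated against
# policy on the data asset itself, regardless of network location, and
# every decision is logged for visibility. Policy/identities are invented.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("zero-trust")

POLICY = {  # asset -> roles allowed to read it
    "customer-db": {"support", "billing"},
    "payroll-db": {"hr"},
}

def authorize(user, role, asset, authenticated):
    allowed = authenticated and role in POLICY.get(asset, set())
    log.info("user=%s role=%s asset=%s allowed=%s", user, role, asset, allowed)
    return allowed

authorize("alice", "support", "customer-db", authenticated=True)   # True
authorize("bob", "support", "payroll-db", authenticated=True)      # False
authorize("mallory", "hr", "payroll-db", authenticated=False)      # False
```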


A Layered Approach

Companies that embrace trust as a competitive advantage will build robust security on three core tenets:

  • Prevention: Evolving defensive strategies from security policies and educational approaches to access controls
  • Detection: Deploying effective systems for the timely detection and notification of intrusions
  • Reaction: Implementing incident response plans similar to those for other disaster recovery scenarios

They’ll build security into their digital ecosystems at three levels:

  1. Secure products. Security in all applications to protect data and transactions
  2. Secure operations. Hardened systems, patch management, security monitoring, end-to-end incident handling, and a comprehensive cloud-operations security framework
  3. Secure companies. A security-aware workforce, end-to-end physical security, and a thorough business continuity framework

Against Digital Armageddon

Experts warn that the worst-case scenario is a state of perpetual cybercrime and cyber warfare, vulnerable critical infrastructure, and trillions of dollars in losses. A collaborative approach will be critical to combatting this persistent global threat with implications not just for corporate and personal data but also strategy, supply chains, products, and physical operations.


Download the executive brief The Future of Cybersecurity: Trust as Competitive Advantage.



To Get Past Blockchain Hype, We Must Think Differently

Susan Galer

Blockchain hype is reaching fever pitch, making it the perfect time to separate market noise from valid signals. As part of my ongoing conversations about blockchain, I reached out to several experts to find out where companies should consider going from here. Raimund Gross, Solution Architect and Futurist at SAP, acknowledged the challenges of understanding and applying such a complex leading-edge technology as blockchain.

“The people who really get it today are those able to put the hype in perspective with what’s realistically doable in the near future, and what’s unlikely to become a reality any time soon, if ever,” Gross said. “You need to commit the resources and find the right partners to lay the groundwork for success.”

Gross told me one of the biggest problems with blockchain – besides the unproven technology itself – was the mindset shift it demands. “Many people aren’t thinking about decentralized architectures with peer-to-peer networks and mash-ups, which is what blockchain is all about. People struggle because often discussions end up with a centralized approach based on past constructs. It will take training and experience to think decentrally.”

Here are several more perspectives on blockchain beyond the screaming headlines.

How blockchain disrupts insurance, banking

Blockchain has the potential to dramatically disrupt industries because the distributed ledger embeds automatic trust across processes. This changes the role of longstanding intermediaries like insurance companies and banks, essentially restructuring business models for entire industries.

“With the distributed ledger, all of the trusted intelligence related to insuring the risk resides in the cloud, providing everyone with access to the same information,” said Nadine Hoffmann, global solution manager for Innovation at SAP Financial Services. “Payment is automatically triggered when the agreed-upon risk scenario occurs. There are limitations given regulations, but blockchain can open up new services opportunities for established insurers, fintech startups, and even consumer-to-consumer offerings.”
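The automatic payment trigger Hoffmann describes is the essence of a smart contract on a distributed ledger. Here is a simplified Python simulation, with an invented flight-delay scenario and amounts; a real implementation would run on a blockchain platform rather than in a single process:

```python
# Simplified simulation of a parametric insurance contract: once the
# agreed-upon risk scenario is observed on the shared ledger, payout is
# triggered automatically, with no intermediary. Values are illustrative.
from dataclasses import dataclass, field

@dataclass
class FlightDelayPolicy:
    insured: str
    payout: float
    delay_threshold_min: int = 120               # agreed-upon risk scenario
    ledger: list = field(default_factory=list)   # shared, append-only record

    def report_delay(self, minutes):
        self.ledger.append(("delay_reported", minutes))
        if minutes >= self.delay_threshold_min:
            self.ledger.append(("payout", self.insured, self.payout))
            return f"Paid {self.payout} to {self.insured}"
        return "No payout: threshold not met"

policy = FlightDelayPolicy(insured="alice", payout=200.0)
print(policy.report_delay(45))    # No payout: threshold not met
print(policy.report_delay(180))   # Paid 200.0 to alice
```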

Banks face a similar digitalized transformation. For an industry long built on layers of steps to mitigate risk, blockchain offers a network of built-in trust that improves efficiency along with the customer experience in areas such as cross-border payments, trade settlements for assets, and other contractual and payment processes. What used to take days or even months could be completed in hours.

Finance departments evolve

CFOs are another group keenly watching blockchain developments. Just as Uber and Airbnb have disrupted transportation and hospitality, blockchain has the potential to change not only the finance department (everything from audits and customs documentation to letters of credit and trade finance) but also the entire company.

“The distributed ledger’s capabilities can automate processes in shared service centers, allowing accountants and other employees in finance to speed up record keeping including proof of payment supporting investigations,” said Georg Koester, senior developer, LoB Finance at the Innovation Center Potsdam. “This lowers costs for the company and improves the customer experience.”

Koester said that embedding blockchain capabilities in software company-wide will also have a tremendous impact on product development, lean supply chain management, and other critical areas of the company.

While financial services dominate blockchain conversations right now, Gross named utilities, healthcare, public sector, real estate, and pretty much any industry as prime candidates for blockchain disruption. “Blockchain is specific to certain business scenarios in any industry,” said Gross. “Every organization can benefit from trust and transparency that mitigates risk and optimizes processes.”

Get started today! Run Live with SAP for Banking. Blast past the hype by attending the SAP Next-Gen Boot Camp on Blockchain in Financial Services and Public Sector event being held April 26-27 in Regensdorf, Switzerland.

Follow me on Twitter, SCN Business Trends, or Facebook. Read all of my Forbes articles here.
