
How To Get Better At Receiving Feedback

Jacob Shriar

Most companies are moving toward a more frequent feedback cycle, which requires a lot of effort from everyone. They also spend an incredible amount of time and energy training managers to give feedback more effectively.

But this means nothing if the receiver of the feedback isn’t able to handle it.

It’s the feedback receiver who controls whether the feedback is listened to. They need to understand what it means, and they’re responsible for making the change. Without them, none of this means anything.

Everyone knows that feedback is important to helping us grow, but many of us get defensive, angry, and insulted. It’s a natural reaction for us to fight back when we hear negative feedback.

Let’s look at why this is, and then suggest a few ways that you can get better at receiving feedback.

Do employees really want feedback?

This is a complex question with a complex answer.

The answer is yes and no. Research shows that employees crave feedback, but neuroscience shows that our brains perceive feedback as a social threat.

Yes – employees crave feedback
Research from Zenger/Folkman, a leadership development consultancy, found that by roughly a 3:1 margin, people want corrective feedback more than praise, because it helps them improve their performance.

[Graph: employees crave feedback]

This graph highlights some of the key problems in organizations. Look at the difference between the second column and the fourth column.

Most managers don’t enjoy giving negative feedback (not very surprising), but employees love hearing it.

The secret is how the feedback is delivered. In their research, a full 92% of people agreed with the statement, “Negative feedback, if delivered appropriately, is effective at improving performance.” Those who rated their managers as highly effective at providing them with honest, straightforward feedback tended to score significantly higher on their preference for receiving corrective feedback.

This is a good lesson for managers.

Many people suggest doing what’s called a “feedback sandwich” when giving negative feedback, but it often doesn’t work. A much smarter approach is to be honest and straightforward. Your employees will appreciate it.

No – our brains don’t like feedback
There is a lot of neuroscience and psychology behind why receiving feedback is so hard.

Our brains want to protect us, and receiving feedback is perceived by the brain in the same way as a physical threat such as being chased.

Because criticism can feel like an actual threat to our survival, it makes it much harder for us to hear. The problem with criticism is that it challenges our sense of value. Criticism implies judgment and we all recoil from feeling judged. – Tony Schwartz

In the book Management Rewired: Why Feedback Doesn’t Work and Other Supervisory Lessons from Brain Science, author Charles Jacobs shows that when people receive information that is in conflict with their self-image, their tendency is to change the information rather than changing themselves.

As humans, we also have a tendency to focus on the negative. This is known as the negativity bias. It’s important for managers to understand, because negative feedback will always have a greater impact on employees than praise.

Tips for receiving feedback

It’s all in your head.

Of course, all of these tips are easier said than done, but you can work on making yourself better able to take feedback. Try to become a little less sensitive to it, and remember that it’s meant to help you.

Executive coach David Rock has developed the SCARF model to help identify things that are likely to trigger social threats. By understanding these, you can start to lower your sense of social threat.

Status: Status is our perception of how we compare to others. Feedback often comes from someone with a higher status than you, so you might respect it more than feedback from a colleague; with a colleague, it can be unclear in the moment whether they’re assuming a higher status.

Certainty: Certainty is how sure we feel about the future. With feedback, it’s hard to be certain of exactly what will be said to you, but you can reduce the threat by preparing yourself mentally.

Autonomy: Autonomy is knowing that you have some control over what happens next. You don’t have much autonomy with feedback; you’re usually expected to respond to it.

Relatedness: Relatedness is how similar or different we are to those around us. It’s important that we feel comfortable with the person giving us feedback. The more of a connection you have, the more receptive you’ll be.

Fairness: Fairness is our assessment of whether we’re being treated fairly. In one survey, 55% of employees said their most recent performance review had been unfair or inaccurate.

If you can understand these triggers and strengthen your capacity to absorb and bounce back from feedback, you’ll get a little better at receiving it every day.

Here are a few other ideas you can use.

  1. Ask for feedback often

    They say practice makes perfect.

    If you can learn how to get into a mindset of continuous improvement and actively ask for feedback to improve, you’ll naturally get better at receiving feedback.

    Some sample questions could be:

    • What do you think is the #1 thing holding me back right now?
    • If you were in my position, what would you change about the way I work?
    • Is there anyone I should be spending more time with to learn or teach?
    • What do you think are my 3 biggest strengths and 3 biggest weaknesses?
  2. Listen carefully

    It’s natural, when someone starts criticizing your work, to want to explain yourself and justify what you’ve done.

    Slow down, really take the time to listen to what the other person is saying, and reflect on it.

    Use feedback as an opportunity to grow, and think that maybe there’s a reason this person is saying this. Even if they’re wrong and it’s an issue of perception, it’s still an issue.

  3. Embrace failure

    This is much easier said than done, but in the last few years failure has become much more accepted in the workplace.

    This is more of an organizational thing, but you should try to cultivate a culture where failure is accepted and employees feel comfortable taking risks and trying new things.

  4. Focus on one thing at a time

    Stanford Professor Clifford Nass says that most people can take in only one critical comment at a time, so it might make sense to collect feedback frequently and focus on one thing at each session.

    That will make it much easier for you to digest the feedback and come up with a plan for improving.

  5. Say thank you

    Never argue with anyone when they give you feedback, even if the feedback is wrong. If you become known as someone who constantly argues and gets defensive, people will be less likely to give you feedback in the future, and you’ll lose opportunities to grow.


How To Design Your Company’s Digital Transformation

Sam Yen

The September issue of the Harvard Business Review features a cover story on design thinking’s coming of age. We have been applying design thinking within SAP for the past 10 years, and I’ve witnessed the growth of this human-centered approach to innovation first hand.

Design thinking is, as the HBR piece points out, “the best tool we have for … developing a responsive, flexible organizational culture.”

This means businesses are doing more to learn about their customers by interacting directly with them. We’re seeing this change in our work on d.forum — a community of design thinking champions and “disruptors” from across industries.

Meanwhile, technology is making it possible to know exponentially more about a customer. Businesses can now make increasingly accurate predictions about customers’ needs well into the future. The businesses best able to access and pull insights from this growing volume of data will win. That requires a fundamental change for our own industry; it necessitates a digital transformation.

So, how do we design this digital transformation?

It starts with the customer and an application of design thinking throughout an organization – blending business, technology and human values to generate innovation. Business is already incorporating design thinking, as the HBR cover story shows. We in technology need to do the same.


Design thinking plays an important role because it helps articulate what the end customer’s experience is going to be like. It helps focus all aspects of the business on understanding and articulating that future experience.

Once an organization is able to do that, the insights from that consumer experience need to be drawn down into the business, with the central question becoming: What does this future customer experience mean for us as an organization? What barriers do we need to remove? Do we need to organize ourselves differently? Does our process need to change – if it does, how? What kind of new technology do we need?

Then an organization must look carefully at roles within itself. What does this knowledge of the end customer’s future experience mean for an individual in human resources, for example, or finance? Those roles can then be viewed as end experiences unto themselves, with organizations applying design thinking to learn about the needs inherent to those roles. They can then change roles to better meet the end customer’s future needs. This end customer-centered approach is what drives change.

This also means design thinking is more important than ever for IT organizations.

We, in the IT industry, have been charged with being responsive to business, using technology to solve the problems business presents. Unfortunately, business sometimes views IT as the organization keeping the lights on. To use the analogy of a store: business is responsible for the front office, focused on growing the business where consumers directly interact with products and marketing, while the perception is that IT focuses on the back office, keeping servers running and the distribution system humming. The key is to have business and IT align to meet the needs of the front office together.

Remember what I said about the growing availability of consumer data? The business best able to access and learn from that data will win. Those of us in IT organizations have the technology to make that win possible, but the way we are seen and our very nature needs to change if we want to remain relevant to business and participate in crafting the winning strategy.

We need to become more front office and less back office, proving to business that we are innovation partners in technology.

This means, in order to communicate with businesses today, we need to take a design thinking approach. We in IT need to show we have an understanding of the end consumer’s needs and experience, and we must align that knowledge and understanding with technological solutions. When this works — when the front office and back office come together in this way — it can lead to solutions that a company could otherwise never have realized.

There are different qualities, of course, between front office and back office requirements. The back office is the foundation of a company and requires robustness, stability, and reliability. The front office, on the other hand, moves much more quickly; it is always changing with new product offerings and marketing campaigns, so its technology must show agility, flexibility, and speed. The business needs both functions to survive. This is a challenge for IT organizations, but it is not an impossible shift for us to make.

Here’s the breakdown of our challenge.

1. We need to better understand the real needs of the business.

This means learning more about the experience and needs of the end customer and then translating that information into technological solutions.

2. We need to be involved in more of the strategic discussions of the business.

Use regular invitations to meetings with business as opportunities to surface deeper learning about the end consumer, and to propose the technology solutions that business may otherwise not know to ask for or how to implement.

The IT industry overall may not have a track record of operating in this way, but if we are not involved in the strategic direction of companies and shedding light on the future path, we risk not being considered innovation partners for the business.

We must collaborate with business, understand the strategic direction and highlight the technical challenges and opportunities. When we do, IT will become a hybrid organization – able to maintain the back office while capitalizing on the front office’s growing technical needs. We will highlight solutions that business could otherwise have missed, ushering in a digital transformation.

Digital transformation goes beyond just technology; it requires a mindset. See What It Really Means To Be A Digital Organization.

This story originally appeared on SAP Business Trends.




About Sam Yen

Sam Yen is the Chief Design Officer for SAP and the Managing Director of SAP Labs Silicon Valley. He is focused on driving a renewed commitment to design and user experience at SAP. Under his leadership, SAP further strengthens its mission of listening to customers’ needs and turning them into tangible results, including SAP Fiori, SAP Screen Personas, and SAP’s UX design services.

How Productive Could You Be With 45 Minutes More Per Day?

Michael Rander

Chances are that you are already feeling your fair share of organizational complexity when navigating your current company, but have you ever considered just how much time is spent across all companies on managing complexity? According to a recent study by the Economist Intelligence Unit (EIU), the global impact of complexity is mind-blowing – and not in a good way.

The study revealed that 38% of respondents spent 16%-25% of their time just dealing with organizational complexity, and 17% spent a staggering 26%-50% of their time doing so. To put that into more concrete numbers, in the US alone, if executives could cut their time spent managing complexity in half, an estimated 8.6 million hours could be saved a week. That corresponds to 45 minutes per executive per day.
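As a quick sanity check on those figures, here is a back-of-the-envelope calculation. This is a sketch only: it assumes a five-day work week, and the implied executive head count is our inference, not a number reported by the EIU.

    # Back-of-the-envelope check of the EIU figures.
    # Assumption: a five-day work week; the implied executive head count
    # is inferred here, not taken from the study.

    hours_saved_per_week = 8.6e6            # estimated US-wide savings (from the study)
    minutes_saved_per_exec_per_day = 45     # per-executive saving (from the study)
    workdays_per_week = 5                   # assumption

    # Hours each executive would recover in a week: 45 min x 5 days = 3.75 hours.
    hours_per_exec_per_week = minutes_saved_per_exec_per_day / 60 * workdays_per_week

    # Executive head count implied by 8.6 million weekly hours.
    implied_executives = hours_saved_per_week / hours_per_exec_per_week
    print(f"{implied_executives:,.0f}")     # ~2,293,333

In other words, the weekly figure is consistent with roughly 2.3 million US executives each recovering 45 minutes a day.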

The potential productivity impact of every executive having 45 more minutes to work every single day is clearly significant. And considering that 55% of respondents say their organization is either very or extremely complex, why are we not making the reduction of complexity one of our top-of-mind issues?

The problem is that identifying the sources of complexity is complex in and of itself. Key sources include organizational size, executive priorities, pace of innovation, decision-making processes, vastly increasing amounts of data to manage, organizational structures, and the very culture of the company. As a consequence, the answers are not universal by any means.

That being said, the negative productivity impact of complexity, regardless of the specific source, is felt similarly across a very large segment of the respondents, with 55% stating that complexity has taken a direct toll on profitability over the past three years.  This is such a serious problem that 8% of respondents actually slowed down their company growth in order to deal with complexity.

So, if complexity oftentimes impacts productivity and subsequently profitability, what are some of the more successful initiatives that companies are taking to combat these effects? Among the answers from the EIU survey, the following were highlighted among the most likely initiatives to reduce complexity and ultimately increase productivity:

  • Making it a company-wide goal to reduce complexity means that the executive level has to live and breathe simplification in order for the rest of the organization to get behind it. Changing behaviors across the organization requires strong leadership, commitment, and change management, and these initiatives ultimately lead to improved decision-making processes, which respondents reported as the top benefit of reducing complexity. From a leadership perspective, this also requires setting appropriate metrics for measuring outcomes; productivity and efficiency were by far the most popular metrics among respondents, though, strangely, collaboration-related metrics did not rank high despite collaboration being a high-level priority.
  • Promoting a culture of collaboration means enabling employees and management alike to collaborate not only within their teams but also across the organization, with partners, and with customers. Creating cross-functional roles to facilitate collaboration was cited by 56% as the most helpful strategy in achieving this goal.
  • More than half (54%) of respondents found the implementation of new technology and tools to be a successful step towards reducing complexity and improving productivity. Enabling collaboration, reducing information overload, building scenarios and prognoses, and enabling real-time decision-making are all areas where technology can help reduce complexity at all levels of the organization.

While these initiatives won’t help everyone, it is interesting to see that more than half of companies believe that if they could cut complexity in half, they could be at least 11%-25% more productive. And nearly one in five respondents indicated that they could be 26%-50% more productive, a massive potential improvement.

The question then becomes whether we can make complexity and its impact on productivity not only more visible as a key issue for companies to address, but (even more importantly) also something that every company and every employee should be actively working to reduce. The potential productivity gains listed by respondents certainly provide food for thought, and few other corporate activities are likely to gain that level of ROI.

Just imagine having 45 minutes each and every day for actively pursuing new projects, getting innovative, collaborating, mentoring, learning, reducing stress, etc. What would you do? The vision is certainly compelling, and the question is: are we, as companies, leaders, and employees, going to do something about it?

To read more about the EIU study, please see:

Feel free to follow me on Twitter: @michaelrander


About Michael Rander

Michael Rander is the Global Research Director for Future Of Work at SAP. He is an experienced project manager, strategic and competitive market researcher, operations manager as well as an avid photographer, athlete, traveler and entrepreneur. Share your thoughts with Michael on Twitter @michaelrander.

More Than Noise: 5 Digital Stories From 2016 That Are Bigger Than You Think

Dan Wellers, Michael Rander, Kai Göerlich, Josh Waddell, Saravana Chandran, and Stephanie Overby

These days it seems that we are witnessing waves of extreme disruption rather than incremental technology change. While some tech news stories have been just so much noise, unlikely to have long-term impact, a few are important signals of much bigger, longer-term changes afoot.

From bots to blockchains, augmented realities to human-machine convergence, a number of rapidly advancing technological capabilities hit important inflection points in 2016. We looked at five important emerging technology news stories that happened this year and the trends set in motion that will have an impact for a long time to come.


Immersive experiences were one of three top-level trends identified by Gartner for 2016, and that was evident in the enormous popularity of Pokémon Go. While the hype may have come and gone, the immersive technologies that have been quietly advancing in the background for years are ready to boil over into the big time—and into the enterprise.

The free location-based augmented reality (AR) game took off shortly after Nintendo launched it in July, and it became the most downloaded app in Apple’s app store history in its first week, as reported by TechCrunch. Average daily usage of the app on Android devices in July 2016 exceeded that of the standard-bearers Snapchat, Instagram, and Facebook, according to SimilarWeb. Within two months, Pokémon Go had generated more than US$440 million, according to Sensor Tower.

Unlike virtual reality (VR), which immerses us in a simulated world, AR layers computer-generated information such as graphics, sound, or other data on top of our view of the real world. In the case of Pokémon Go, players venture through the physical world using a digital map to search for Pokémon characters.

The game’s instant global acceptance was a surprise. Most watching this space expected an immersive headset device like Oculus Rift or Google Cardboard to steal the headlines. But it took Pikachu and the gang to break through. Pokémon Go capitalized on a generation’s nostalgia for its childhood and harnessed the latest advancements in key AR enabling technologies such as geolocation and computer vision.

Just as mobile technologies percolated inside companies for several years before the iPhone exploded onto the market, companies have been dabbling in AR since the beginning of the decade. IKEA created an AR catalog app in 2013 to help customers visualize how their KIVIK modular sofa, for example, would look in their living rooms. Mitsubishi Electric has been perfecting an AR application, introduced in 2011, that enables homeowners to visualize its HVAC products in their homes. Newport News Shipbuilding has launched some 30 AR projects to help the company build and maintain its vessels. Tech giants including Facebook, HP, and Apple have been snapping up immersive tech startups for some time.

The overnight success of Pokémon Go will fuel interest in and understanding of all mediated reality technology—virtual and augmented. It’s created a shorthand for describing immersive reality and could launch a wave of technology consumerization the likes of which we haven’t seen since the iPhone instigated a tsunami of smartphone usage. Enterprises would be wise to figure out the role of immersive technology sooner rather than later. “AR and VR will both be the new normal within five years,” says futurist Gerd Leonhard, noting that the biggest hurdles may be mobile bandwidth availability and concerns about sensory overload. “Pokémon is an obvious opening scene only—professional use of AR and VR will explode.”


Blockchains, the decentralized digital ledgers of transactions that are processed by a distributed network, first made headlines as the foundation for new types of financial transactions beginning with Bitcoin in 2009. According to Greenwich Associates, financial and technology companies will invest an estimated $1 billion in blockchain technology in 2016. But, as Gartner recently pointed out, there could be even more rapid evolution and acceptance in the areas of manufacturing, government, healthcare, and education.

By the 2020s, blockchain-based systems will reduce or eliminate many points of friction for a variety of business transactions. Individuals and companies will be able to exchange a wide range of digitized or digitally represented assets and value with anyone else, according to PwC. The supervised peer-to-peer network concept “is the future,” says Leonhard.

But the most important blockchain-related news of 2016 revealed a weak link in the application of technology that is touted as an immutable record.

In theory, blockchain technology creates a highly tamper-resistant structure that makes transactions secure and verifiable through a massively distributed digital ledger. All the transactions that take place are recorded in this ledger, which lives on many computers. High-grade encryption makes it nearly impossible for someone to cheat the system.
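To make the idea of a tamper-resistant ledger concrete, here is a deliberately minimal Python sketch of the hash-chaining at the heart of a blockchain. It is a toy: real blockchains add distributed consensus, digital signatures, and proof-of-work or similar mechanisms on top of this linking.

    import hashlib
    import json

    def block_hash(block):
        """Hash a block's contents (which include the previous block's hash)."""
        payload = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

    def append_block(chain, transactions):
        """Link a new block to the chain via the previous block's hash."""
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev_hash": prev, "transactions": transactions})

    def is_valid(chain):
        """Recompute every link; editing any earlier block breaks the chain."""
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    chain = []
    append_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
    append_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
    print(is_valid(chain))                       # True

    chain[0]["transactions"][0]["amount"] = 500  # tamper with history
    print(is_valid(chain))                       # False: the tampering is detected

Because each block’s hash depends on the previous block’s hash, editing any historical transaction invalidates every later link, and copies of the ledger on other computers will reject the altered chain.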

In practice, however, blockchain-based transactions and contracts are only as good as the code that enables them.

Case in point: The DAO, one of the first major implementations of a “Decentralized Autonomous Organization” (for which the fund is named). The DAO was a crowdfunded venture capital fund using cryptocurrency for investments and run through smart contracts. The rules that govern those smart contracts, along with all financial transaction records, are maintained on the blockchain. In June, the DAO revealed that an individual exploited a vulnerability in the company’s smart contract code to take control of nearly $60 million worth of the company’s digital currency.

The fund’s investors voted to basically rewrite the smart contract code and roll back the transaction, in essence going against the intent of blockchain-based smart contracts, which are supposed to be irreversible once they self-execute.

The DAO’s experience confirmed one of the inherent risks of distributed ledger technology—and, in particular, the risk of running a very large fund autonomously through smart contracts based on blockchain technology. Smart contract code must be as error-free as possible. As Cornell University professor and hacker Emin Gün Sirer wrote in his blog, “writing a robust, secure smart contract requires extreme amounts of diligence. It’s more similar to writing code for a nuclear power reactor, than to writing loose web code.” Since smart contracts are intended to be executed irreversibly on the blockchain, their code should not be rewritten and improved over time, as software typically is. But since no code can ever be completely airtight, smart contracts may have to build in contingency plans for when weaknesses in their code are exploited.
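The DAO exploit was a reentrancy flaw of roughly this shape. The sketch below is hypothetical and simplified: real smart contracts are Solidity code running on Ethereum, but the underlying ordering bug, paying out before updating the balance, can be modeled in a few lines of Python.

    # Hypothetical, simplified model of a reentrancy-style flaw.
    # Real smart contracts run on a blockchain; this toy only shows the
    # ordering bug: the external call happens before the balance update.

    class VulnerableFund:
        def __init__(self):
            self.balances = {}

        def deposit(self, who, amount):
            self.balances[who] = self.balances.get(who, 0) + amount

        def withdraw(self, who, send):
            amount = self.balances.get(who, 0)
            if amount > 0:
                send(amount)            # BUG: pay out first...
                self.balances[who] = 0  # ...zero the balance only afterwards

    fund = VulnerableFund()
    fund.deposit("attacker", 100)

    stolen = []
    def malicious_receiver(amount):
        stolen.append(amount)
        if len(stolen) < 3:             # re-enter withdraw before the reset
            fund.withdraw("attacker", malicious_receiver)

    fund.withdraw("attacker", malicious_receiver)
    print(sum(stolen))                  # 300 drained from a 100 deposit

The fix is simple in principle: zero the balance before making the external call. The hard part, as Sirer suggests, is sustaining that level of care across every line of a contract that cannot be patched after it ships.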

Importantly, the incident was not a result of any inherent weakness in the blockchain or distributed ledger technology generally. It will not be the end of cryptocurrencies or smart contracts. And it’s leading to more consideration of editable blockchains, which proponents say would only be used in extraordinary circumstances, according to Technology Review.


Application programming interfaces (APIs), the computer codes that serve as a bridge between software applications, are not traditionally a hot topic outside of coder circles. But they are critical components in much of the consumer technology we’ve all come to rely on day-to-day.

One of the most important events in API history was the introduction of such an interface for Google Maps a decade ago. The map app was so popular that everyone wanted to incorporate its capabilities into their own systems. So Google released an API that enabled developers to connect to and use the technology without having to hack into it. The result was the launch of hundreds of inventive location-enabled apps using Google technology. Today, millions of web sites and apps use Google Maps APIs, from Allstate’s GoodHome app, which shows homeowners a personalized risk assessment of their properties, to Harley-Davidson’s Ride Planner to 7-Eleven’s app for finding the nearest Slurpee.
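For a sense of how little code it takes to build on such an API, here is a minimal sketch that calls the Google Maps Geocoding API, one of the Maps APIs mentioned above, from Python. The endpoint and parameters follow Google’s public documentation; the API key is a placeholder you would obtain from Google.

    # Minimal sketch: geocode an address with the Google Maps Geocoding API.
    # Requires the third-party "requests" package (pip install requests) and
    # a real API key in place of the placeholder below.
    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder: create one in the Google developer console

    resp = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": "1600 Amphitheatre Parkway, Mountain View, CA",
                "key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    location = resp.json()["results"][0]["geometry"]["location"]
    print(location)  # e.g. {'lat': 37.42..., 'lng': -122.08...}

Everything from a homeowner’s risk map to a store finder starts from a call much like this one, with the product logic layered on top.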

Ultimately, it became de rigueur for apps to open up their systems in a safe way for experimentation by others through APIs. Technology professional Kin Lane, who tracks the now enormous world of APIs, has said, “APIs bring together a unique blend of technology, business, and politics into a transparent, self-service mix that can foster innovation.”

Thus it was significant when Apple announced in June that it would open up Siri to third-party developers through an API, giving the wider world the ability to integrate Siri’s voice commands into their apps. The move came on the heels of similar decisions by Amazon, Facebook, and Microsoft, all of which have AI bots or assistants of their own. And in October, Google opened up its Google Assistant as well.

The introduction of APIs confirms that the AI technology behind these bots has matured significantly—and that a new wave of AI-based innovation is nigh.

The best way to spark that innovation is to open up AI technologies such as Siri so that coders can use them as platforms to build new apps that can more rapidly expand AI uses and capabilities. Call it the “platformication” of AI. The value will be less in the specific AI products a company introduces than in the value of the platform for innovation. And that depends on the quality of the API. The tech company that attracts the best and brightest will win. AI platforms are just beginning to emerge and the question is: Who will be the platform leader?


In June, Swiss citizens voted on a proposal to introduce a guaranteed basic income for all citizens, as reported by BBC News. Switzerland was the first country to take the issue to the polls, but it won’t be the last. Discussions about the impact of both automation and the advancing gig economy on individual livelihoods are happening around the world. Other countries—including the United States—are looking at solutions to the problem. Both Finland and the Netherlands have universal guaranteed income pilots planned for next year. Meanwhile, American startup incubator Y Combinator is launching an experiment to give 100 families in Oakland, California, a minimum wage for five years with no strings attached, according to Quartz.

The world is on the verge of potential job loss at a scale and speed never seen before. The Industrial Revolution was more of an evolution, happening over more than a century. The ongoing digital revolution is happening in relative hyper speed.

No one is exactly sure how increased automation and digitization will affect the world’s workforce. One 2013 study suggests as much as 47% of the U.S. workforce is at risk of being replaced by machines over the next two decades, but even a conservative estimate of 10% could have a dramatic impact, not just on workers but on society as a whole.

The proposed solution in Switzerland did not pass, in part because a major political party did not introduce it, and citizens are only beginning to consider the potential implications of digitization on their incomes. What’s more, the idea of simply guaranteeing pay runs contrary to long-held notions in many societies that humans ought to earn their keep.

Whether or not state-funded support is the answer is just one of the questions that must be answered. The votes and pilots underway make it clear that governments will have to respond with some policy measures. The question is: What will those measures be? The larger impact of mass job displacement, what future employment conditions might look like, and what the responsibilities of institutions are in ensuring that we can support ourselves are among the issues that policy makers will need to address.

New business models resulting from digitization will create some new types of roles—but those will require training and perhaps continued education. And not all of those who will be displaced will be in a position to remake their careers. Just consider taxi drivers: In the United States, about 223,000 people currently earn their living behind the wheel of a hired car. The average New York livery driver is 46 years old, according to the New York City Taxi and Limousine Commission, and no formal education is required. When self-driving cars take over, those jobs will go away and the men and women who held them may not be qualified for the new positions that emerge.

As digitization dramatically changes the constructs of commerce and work, no one is quite sure how people will be impacted. But waiting to see how it all shakes out is not a winning strategy. Companies and governments today will have to experiment with potential solutions before the severity of the problem is clear. Among the questions that will have to be answered: How can we retrain large parts of the workforce? How will we support those who fall through the cracks? Will we prioritize and fund education? Technological progress and shifting work models will continue, whether or not we plan for their consequences.


In April, a young man, who was believed to have permanently lost feeling in and control over his hands and legs as the result of a devastating spine injury, became able to use his right hand and fingers again. He used technology that transmits his thoughts directly to his hand muscles, bypassing his injured spinal cord. Doctors implanted a computer chip into the quadriplegic’s brain two years ago and—with ongoing training and practice—he can now perform everyday tasks like pouring from a bottle and playing video games.

The system reconnected the man’s brain directly to his muscles—the first time that engineers have successfully bypassed the nervous system’s information superhighway, the spinal cord. It’s the medical equivalent of moving from wired to wireless computing.

The man has in essence become a cyborg, a term first coined in 1960 to describe “self-regulating human-machine systems.” Yet the beneficiary of this scientific advance himself said, “You’re not going to be looked on as, ‘Oh, I’m a cyborg now because I have this big huge prosthetic on the side of my arm.’ It’s something a lot more natural and intuitive to learn because I can see my own hand reacting.”

As described in IEEE Spectrum, the “neural-bypass system” records signals that the man generates when thinking about moving his hand, decodes those signals, and routes them to the electric sleeve around his arm to stimulate movement: “The result looks surprisingly simple and natural: When Burkhart thinks about picking up a bottle, he picks up the bottle. When he thinks about playing a chord in Guitar Hero, he plays the chord.”

What seems straightforward on the surface is powered by a sophisticated algorithm that can analyze the vast amounts of data the man’s brain produces, separating important signals from noise.

The fact that engineers have begun to unlock the complex code that controls brain-body communication opens up enormous possibilities. Neural prostheses such as cochlear implants have already reversed hearing loss. Light-sensitive chips serving as artificial retinas are showing progress in restoring vision. Other researchers are exploring computer implants that can read human thoughts directly to signal an external computer to help people speak or move in new ways. “Human and machine are converging,” says Leonhard.

The National Academy of Engineering predicts that “the intersection of engineering and neuroscience promises great advances in healthcare, manufacturing, and communication.”

Burkhart spent two years in training with the computer that has helped power his arm to get this far. It’s the result of more than a decade of development in brain-computer interfaces. And it can currently be used only in the lab; researchers are working on a system for home use. But it’s a clear indication of how quickly the lines between man and machine are blurring—and it opens the door for further computerized reanimation in many new scenarios.

This fall, Switzerland hosted its first cyborg Olympics, in which disabled patients compete using the latest assistive technologies, including robot exoskeletons and brainwave-readers. Paraplegic athletes use electrical stimulation systems to compete in cycling, for example. The winners are those who can control their device the best. “Instead of celebrating the human body moving under its own power,” said a recent article in the IEEE Spectrum, “the cyborg games will celebrate the strength and ingenuity of human-machine collaborations.” D!

Read more thought-provoking articles in the latest issue of the Digitalist Magazine, Executive Quarterly.


About Dan Wellers

Dan Wellers is the Global Lead of Digital Futures at SAP, which explores how organizations can anticipate the future impact of exponential technologies. Dan has extensive experience in technology marketing and business strategy, plus management, consulting, and sales.


The Future Of Work Is Now

Stefan Ries

Far beyond collaboration, the digitization of work determines how we work and engage people. Technologies – such as artificial intelligence, machine learning, robotics, analytics, and cloud technologies – change the way we recruit, develop talent, and make our workforce more inclusive. They also introduce new jobs, largely with different skill set requirements. Some of the most-wanted jobs today did not exist five years ago – and many jobs we wouldn’t even imagine today will arise in the near future. Our workplace is changing at light speed.


Technology accelerates the transformation of businesses and industries. We need to prepare our businesses for the future, anticipating skills requirements and workforce changes. While some of the developments are unpredictable, it is up to thought and industry leaders like us to take control and shape the future of work.

SAP Future Factor, an interactive Web series: Engaging with thought leaders about the future of work

Welcome to the SAP Future Factor Web Salon, an interactive Web series featuring perspectives of thought leaders from academia, business, and government about the workplace of the future. The series drives a continuous exchange about the impacts of digitization on organizations and shares insight on innovative practices already in place.

The inaugural episode features SAP chief human resources officer Stefan Ries and Kevin Kruse, leadership expert and author of the New York Times best-seller “We: How to Increase Performance and Profits Through Full Engagement.” The two thought leaders exchange views on the opportunities and challenges of a digitized workplace and business culture. Their discussion touches on the rising digital workplace, new ways to collaborate, the role technology plays in fostering diversity and inclusion, employee engagement, and talent development.

Choose the topics that match your needs

Tomorrow’s workplace is all about choices – and so is the format of the SAP Future Factor Web series. All episodes are fully interactive, giving you the opportunity to interact with the content of the video by choosing topics of interest to you and your business. You determine what you would like to view and learn about, and in what order.

Episode 1 features the following topics:

  • Impacts of Digitization
  • HR’s Role in a Digitized World
  • Cloud Culture
  • Business Beyond Bias
  • Man vs. Machine
  • Rise of Social Intelligence

The future is now. Engage with us in the SAP Future Factor!

We hope you will enjoy the first episode. Tell us what you think.

Are the biggest trends from the last year on your radar screen? See More Than Noise: 5 Digital Stories From 2016 That Are Bigger Than You Think.



About Stefan Ries

Stefan Ries is Chief Human Resources Officer (CHRO), Labor Relations Director, and a member of the Executive Board of SAP SE. Stefan was born in Bavaria and raised in Constance, Germany, where he spent most of his youth. After receiving his master’s in business economics from the University of Constance in 1991, he moved to Munich. He started his career as HR Manager at Microsoft, overseeing HR duties in Austria, Switzerland, and Eastern European countries. In July 1994, he went on to lead the HR function for Compaq Computer in Europe, the Middle East, and Africa. Following the company’s acquisitions of Tandem Computers and Digital Equipment Corporation in 1999 and 2000, Stefan led the entire HR organization for Compaq in Germany. Stefan first joined SAP in 2002 and later became responsible for various HR functions, heading up the HR business partner organization and overseeing all HR functions on an operational level. To support innovation, Stefan attaches great importance to a diverse working culture. He is convinced that appreciating the differences among people, their unique backgrounds, and personalities is a key success factor for SAP.