Bring Your Robot to Work

As debate rages over the impact of robots on jobs, an interesting thing is happening: Robots aren’t simply evolving as human replacements; they’re also becoming collaborators and extensions of human abilities.

Fascination with—and fear of—the interplay between human and machine is as old as industrialization itself and has fueled the popular imagination ever since. Think of science fiction author Isaac Asimov’s three laws of robotics, which created a quasi code of ethics for our potentially menacing mechanized counterparts, or director James Cameron’s cinematic series featuring an ongoing battle between humanity and a race of robot warriors.

Self-actualized automatons and an impending robo-pocalypse certainly make for great entertainment. And in an era in which we define our lives largely by the work we do, there is a natural concern that humans will be replaced by robots for nearly all endeavors, resulting in a labor-light economy in which most work is automated and people take what little is left.

Certainly, jobs have been and will continue to be displaced by robotics, automation, and artificial intelligence (see “The Jobs Question”). However, a different future is more probable, and likely more profitable, at least when it comes to robotics: human-machine coevolution. As robots take on increasingly complex tasks, new forms of human-machine interaction will emerge, and the structure of both industry and society will evolve to accommodate this emerging symbiotic relationship.

The Jobs Question 

Are we looking at a future with vast numbers of unemployable humans? The short answer is that we don’t really know.

So far, the overall impact of combining robotics with automation and artificial intelligence has been to displace workers in routine, repetitive jobs such as meter readers and typists. However, as these technologies advance, they will take on tasks once thought too complex for machines. Taking that into account, Oxford University researchers in 2013 introduced a new way of assessing the vulnerability of jobs to technology, arguing that occupations involving perception and manipulation, creative intelligence, and social intelligence were least likely to be automated. They concluded that nearly half of total U.S. employment (47%) is at high risk of being automated, adding that “as technology races ahead, low-skill workers will reallocate to tasks that are non-susceptible to computerization — i.e., tasks requiring creative and social intelligence. For workers to win the race, however, they will have to acquire creative and social skills.”

However, a recent survey by the Pew Research Center found that while experts agreed that “robotics and artificial intelligence will permeate wide segments of daily life by 2025, with huge implications for a range of industries,” they were deeply divided about the ultimate impact on employment. Roughly half of the experts canvassed believed that automation would displace significant numbers of workers, leading to “income inequality, masses of people who are effectively unemployable, and breakdowns in the social order.” The other half had faith that although technology would displace many jobs, just as many new jobs and industries would arise to take their place, as with every technological disruption since the beginning of the industrial revolution.

Meanwhile, to date there has been little good data on the specific economic impact of robots on jobs and industry. Earlier this year, researchers at the London School of Economics and Political Science published the results of a study of the impact of industrial robots on manufacturing in 17 developed countries from 1993 through 2007. They found that while the robots did reduce the hours of low- and middle-skilled workers in some instances, they had no significant effect on total hours worked. In fact, the robots increased wages, productivity, and average growth rates in those countries.

The Varied Robot Species

A number of different robot forms and functions are evolving today, ranging from the Roomba that vacuums your floors to the sophisticated rovers propelling themselves across the surface of Mars.

Some robotic machines will act as manual laborers with limited intelligence, performing work that is too difficult or undesirable for the rest of us, with or without the input of human operators. Other robots will have higher levels of autonomy and independent decision-making capabilities and will require varying levels of human involvement. Another significant category consists of diverse human-machine units—often called wearable robotics—which serve as robotic extensions of the human form. They include exoskeletons, bodysuits, and bionic limbs that expand human capabilities.

Each type will have its place in our daily lives, though it’s unlikely that we’ll be pointing them out as robots. “At a certain point,” says Brett Kennedy, supervisor of the robotics group at NASA’s Jet Propulsion Laboratory (JPL), “they just become tools. When you look at the Roomba, I think people just think of that as their vacuum cleaner, which happens to be robotic. And we’re going to see more things like that.”

Robots as Helpers

Robots have been in use on factory floors for some time, but additional value lies in coordinating the input of humans and machines across the enterprise. Amazon’s newest fulfillment center in Robbinsville, NJ, for example, hums along powered by the collaboration of human and machine, reports Kevin Shea. Compact robots move shelves across the floor to human workers, who retrieve products or restock them, increasing efficiency and productivity.

NASA’s JPL in Pasadena, CA, developed RoboSimian, an ape-like limbed robot capable of both mobility and manipulation, to perform tasks too dangerous for humans in disaster-response situations. The robot can use its perception systems, including stereo cameras and a Light Detection and Ranging (LIDAR) device, to create 3D maps of its environment, and can feel around the terrain using four sensors on its wrists; still, a human operator ultimately decides where it should go and what tasks it should perform.

“The point is not to make humans obsolete but to take the stuff that’s simple enough for a robot to do and let the robot do it,” says Doug Stephen, research associate at the Institute for Human and Machine Cognition, University of West Florida. “As a result, you free up the people involved with the process to do the things that human beings are better at, leveraging their creativity, intuition, and intelligence.”

The Quest for Autonomous Robots

It’s unlikely that we’ll all be coming home to a Rosie the Robot like the Jetsons did anytime soon. Right now, it takes an incredible amount of effort to program a limbed robot that can recover from a fall, for example. A person who trips on a crack in the sidewalk naturally knows how to recover and not topple over. A robot, on the other hand, can’t do that quickly or reliably. The problem is that our understanding of such human capabilities remains limited.

“A lot of it is figuring out how to formally define the processes that human beings or any legged animal would use to maintain balance or to walk, and then tell a robot how to do it. You have to be really explicit in the instructions that you give to these machines,” Stephen says. There’s a reason Amazon launched an open challenge to create a robot capable of picking and packing boxes for shipment as well as a human being does—because it’s incredibly hard to do.
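How explicit is “really explicit”? The following minimal sketch, a hypothetical illustration rather than code from JPL or IHMC, reduces the robot to the simplest balance model there is, an inverted pendulum, and shows that even then, a reaction a person takes for granted has to be spelled out as an equation with hand-tuned constants.

```python
# A person "just knows" to lean back when tipping forward. A robot needs
# that intuition written out, here as a proportional-derivative (PD)
# controller pushing an inverted pendulum back toward upright (angle = 0).

def balance_torque(angle, angular_velocity, kp=120.0, kd=15.0):
    """Corrective torque from lean angle (radians) and lean rate (rad/s).

    kp and kd are hand-tuned gains; a real limbed robot needs many such
    controllers, plus logic for deciding when each one applies.
    """
    return -kp * angle - kd * angular_velocity

# One step of a loop the robot would run hundreds of times per second:
angle, angular_velocity = 0.05, 0.2   # leaning slightly forward, and tipping
print(f"commanded ankle torque: {balance_torque(angle, angular_velocity):.1f} N*m")
```

Multiply that by every joint, every terrain type, and every way a foot can slip, and the scale of the effort Stephen describes comes into focus.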

In 2010, DARPA launched its four-year Autonomous Robotic Manipulator Challenge to create a machine capable of carrying out complex tasks with only high-level human involvement. Some robots completed the challenge, but “they were really, really, really slow,” says Kennedy. “We may get to a point where robots can do these sorts of things on their own. But they’re just not as good as people at this point.”

The Mars rover Curiosity, which Kennedy worked on, is capable of “supervised autonomy” because there was no way to control a robot 141 million miles away with a joystick. Human operators give the robot tasks to do once or twice a day, and the rover does its best to execute. “They’re autonomous to a certain degree,” Kennedy says. “But they’re really just executing the intent of the operator here on the ground.”
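The division of labor Kennedy describes, in which operators specify intent once or twice a day and the rover fills in the details, can be pictured as an uplinked task queue. The sketch below is a hypothetical illustration of that pattern, not JPL flight software; the Task fields and function names are assumptions.

```python
# Supervised autonomy in miniature: the operator decides *what* to do,
# the robot decides *how*, and results are reported back on the next pass.
from dataclasses import dataclass

@dataclass
class Task:
    name: str          # high-level intent, e.g. "drive_to(waypoint_7)"
    timeout_s: float   # give up and move on rather than wait for Earth

def execute(task: Task) -> bool:
    # Stand-in for the onboard autonomy (path planning, hazard avoidance,
    # instrument control); here every task simply "succeeds."
    print(f"executing {task.name} (time budget {task.timeout_s}s)")
    return True

def run_daily_plan(tasks: list[Task]) -> list[tuple[str, bool]]:
    """Work through one uplinked plan without further human input."""
    results = []
    for task in tasks:
        ok = execute(task)
        results.append((task.name, ok))
        if not ok:
            break              # stop safely and wait for the next uplink
    return results             # downlinked to the operators on the ground

plan = [Task("drive_to(waypoint_7)", 1800.0), Task("image_target(rock_3)", 300.0)]
print(run_daily_plan(plan))
```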

Robots as Human Extensions

Another hot segment of robotics research today is artificial extensions of the human form, such as suits or exoskeletons that strengthen arms and legs, sharpen night vision, and offer other sensory enhancements.

Researchers at New York University and the University of West Florida’s Institute for Human and Machine Cognition are developing lower-extremity exoskeletons for the disabled. At the University of California, Berkeley, robotics experts are testing a similar device designed to help paraplegics walk. Harvard’s Wyss Institute for Biologically Inspired Engineering is working on a fabric exosuit designed to “reduce injuries, improve stamina, and enhance balance even for those [people] with weakened muscles,” according to ExtremeTech.

The U.S. military is tapping industry to develop a special outfit for commandos. Dubbed the “Iron Man” suit, it could include “superhuman strength, sensors that respond directly to brain functions, and liquid armor,” according to The Washington Post. A team of researchers at Harvard University has created a prototype for a smart suit that makes its wearer faster, more agile, and better able to move heavy objects. A device called Tacit helps the visually impaired feel objects around them: attached to the wrist, it uses ultrasound waves to scan the surrounding area and applies pressure to the wearer when objects get too close (a sketch of this mapping follows below). Meanwhile, according to CNET, a German programmer has invented a way to pilot a camera-mounted drone using just his head. (Read Opportunity Matrix and Disrupters in this issue to learn about other devices like these.)
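The idea behind a Tacit-style device fits in a few lines: map each ultrasonic range reading to pressure on the wrist, so that nearer obstacles press harder. The sketch below is a hypothetical illustration; the range limits and the linear scaling are assumptions, not the device’s published specifications.

```python
# Map a sonar distance reading to haptic pressure: nearer objects press harder.

def haptic_pressure(distance_cm: float,
                    min_range_cm: float = 2.0,
                    max_range_cm: float = 350.0) -> float:
    """Return pressure as a fraction of full actuation (0.0 to 1.0)."""
    clamped = max(min_range_cm, min(distance_cm, max_range_cm))
    # Linear mapping: closest object -> full pressure; out of range -> none.
    return 1.0 - (clamped - min_range_cm) / (max_range_cm - min_range_cm)

for d in (5, 50, 200, 350):   # simulated sonar readings, in centimeters
    print(f"{d:>4} cm -> pressure {haptic_pressure(d):.2f}")
```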

What Companies Should Do Next

Rather than fearing the arrival of robot overlords, companies should start thinking about how they might put human-machine collaboration to work. Here are some ways to start:

  • Explore the possibilities. Companies must experiment with the latest sensor and robotic technologies as they emerge. Enterprises can pilot and implement virtual and augmented reality in production and supply chains. “The very best things that we’ve been able to produce have come from people having the tools and then figuring out how they can be used,” says Kennedy. “I don’t think we understand the future well enough to be able to predict exactly how things are going to be used, but I think we can say that certainly they will be used.”
  • Develop specific usage scenarios. Companies thinking ahead will develop future scenarios and strategies built on their unique business models and industry requirements. For example, healthcare companies are developing use cases for bionic suits for patients in rehab or the next robotic technologies for surgery. Agricultural companies are looking into robots for eliminating weeds without pesticides. Environmental research firms might explore machines to help with ocean exploration. Manufacturers can assess the next generation of robots capable of safely working alongside humans on a factory floor.
  • Focus on improvements at the process level. Businesses should also continue to digitize those business operations ripe for automation and identify those processes that benefit from human advantages like creativity and problem solving. In order to identify functions that could be enhanced by a robot-human combination, companies can invite users to come up with new design ideas and process changes. It will be critical to keep an open mind to entirely new robotic forms and functions.
  • Make robots part of the team. Companies should incorporate robots designed not just for task work but also for teamwork. Robotics engineers place a premium on something they call OPD: observability, predictability, and directability. For humans and machines to work together efficiently and safely, each must behave in ways the other can observe, predict, and direct (see the sketch after this list). Those goals must be at the forefront of any new interfaces, technologies, roles, or processes put in place.
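To make OPD concrete, here is a minimal sketch, in Python, of what such a contract might look like in software. The RobotTeammate interface and its method names are illustrative assumptions, not an established robotics API.

```python
# Hypothetical illustration of the OPD principle: any robot joining a
# mixed human-machine team must be observable, predictable, and directable.
from abc import ABC, abstractmethod

class RobotTeammate(ABC):
    @abstractmethod
    def report_state(self) -> dict:
        """Observable: expose current status in terms a human can read."""

    @abstractmethod
    def announce_intent(self) -> str:
        """Predictable: state the next planned action before taking it."""

    @abstractmethod
    def accept_directive(self, directive: str) -> bool:
        """Directable: let a human redirect the robot; return False when a
        directive cannot be carried out safely."""
```

Any concrete controller implementing these three methods gives its human teammates a window into what the robot is doing, what it will do next, and a handle with which to change its course.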

The Future

As speech and real-time image recognition improve, as memory and analytics capabilities increase, and as virtual and augmented reality options advance, better, faster, and cheaper robotics options for humans will emerge. There will be a new class of sophisticated robots with defined autonomy, heightened empathy, and significant artificial intelligence. Thanks to this symbiosis, we’ll be able to take on challenging adventures like colonizing our oceans or expanding space travel in ways that neither humans nor machines can accomplish alone.

“Robots are becoming a lot more capable through advances in fields like computer vision, machine learning, and other software-driven fields,” says the University of West Florida’s Stephen. “The state of the art is advancing at an incredible pace, and it’s allowing us to develop new levels of autonomy with the existing robotic infrastructure that we already have. So you’re going to see huge gains in that over the next 10 years.”

About the authors:

Kai Goerlich is Idea Director, Thought Leadership, at SAP.

Torsten Wolf is EMEA Services & Support Head IoT and Digital Experience, SAP.

Stephanie Overby is an independent writer and editor focused on the intersection of business and technology.


Executive Quarterly, Feature 3, Q1 2016