All change begins with the ability to measure. For millennia, humans relied on our five senses to gauge the world around us in order to survive and thrive.
As civilization advanced, however, we started to use technology to expand those natural capabilities. We began building tools to measure time – first sundials, then more sophisticated instruments like the Nebra sky disc and the Antikythera mechanism. Early maps on the walls of the Lascaux caves charted the night sky, while the ancient Greek philosopher Anaximander drew the first known map of the world. Beginning with telescopes and microscopes in the sixteenth and seventeenth centuries, and continuing today with the particle accelerators at CERN, we’ve developed increasingly advanced tools to examine aspects of the universe beyond human perception.
Today, thanks to the explosion of low-cost sensors and high-powered processing and analytics capabilities, we are on the cusp of the next wave of magnifying our natural ability to assess the world around us and stretching the limits of human perception – if we can open our minds wide enough to let it in.
The tiny engines driving the digital revolution
Sensors come in a variety of form factors and with wide-ranging functions: wearables like fitness and health trackers, infrared imaging and night vision sensors, motion sensors such as gyroscopes, chemical and biological sensors, accelerometers and torque sensors, light sensors, gestural sensors. Increasingly, we’re seeing combination sensors that are capable of gathering multiple types of data from the world.
Experts predict that the universe of sensors will grow exponentially in the near future; depending upon which estimate you believe, there could be as many as 100 trillion sensors by 2030. These tiny bits of technology are driving everything from robotics to self-diagnosing appliances.
A number of advancements are converging to make these increasingly ubiquitous sensors both cheaper and more capable every day. Image, speech, and voice recognition will advance to near 100% accuracy by 2025, according to published research forecasts. Today, a passive RFID tag costs between 7 and 15 cents to produce. While active tags are more expensive, their cost is also rapidly dropping. Emerging 3D printers will enable lower-cost production of sensors (and nano-sensors) to embed in day-to-day items like glasses or apparel. These sensors will quickly become standard issue in many places, including the 111 million new cars and the 2 billion smartphones that will be purchased in 2020. If you have one of the latest smartphones, you already have several sensors on board, including a magnetometer, barometer, thermometer, gyroscope, proximity sensor, accelerometer, and light sensor. Indeed, many future sensors will be practically invisible to us.
Perhaps more importantly, the analytics capabilities required to make sense of the staggering amounts of new sensor data – we could be talking brontobytes, or 1,000,000,000,000 petabytes – are also rapidly advancing. The speed of analytics will increase thirty-fold by 2030, with 95% of queries answered in mere milliseconds, according to SAP estimates. That will be critical in transforming this truly big data into smaller, digestible bites of information. The question is whether we can cope with it. As Professor Dr. Yvonne Förster of Leuphana University in Lüneburg, Germany, points out, our devices already process and deliver information much faster than our human perception can track. As most of these technology-induced rhythms run outside our awareness, it will be interesting to see how we adapt to them.
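The scale behind that “brontobyte” figure can be checked with simple arithmetic. (Note that “brontobyte” is an informal term for 10^27 bytes, not a standardized SI prefix.)

```python
# Decimal byte-unit arithmetic behind the "brontobytes" figure.
PETABYTE = 10**15    # bytes (standard SI prefix "peta")
BRONTOBYTE = 10**27  # bytes (informal, non-standardized term)

petabytes_per_brontobyte = BRONTOBYTE // PETABYTE
print(f"{petabytes_per_brontobyte:,} petabytes per brontobyte")
# 1,000,000,000,000 petabytes per brontobyte
```

In other words, one brontobyte equals a trillion petabytes, matching the figure above.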
Widening the doors of perception
Our innate biological senses and nervous systems are truly amazing. The human eye contains 2 × 10⁸ sensors, the ear 3 × 10⁴, and the nose 3 × 10⁷. But with an expanding network of increasingly sophisticated and embedded sensors we’ll be able to expand our perception far beyond our human capabilities. Ultimately, we’ll be able to create an intelligent matrix of sensors and analytic tools to measure, detect, and analyze more data from the world around us. Looking beyond 2025, we will advance beyond data analysis as a distinct activity to more directly experiencing data as an additional aspect of life around us. We will experience the world in much finer detail using virtual reality and other technologies that tap into our biological senses at their roots.
Ultrasound, infrared, low-frequency, and position sensors will increase our vision and hearing. Chemical sensors will amplify our ability to smell and taste. Mechanosensors will intensify what we can feel. Medical and biological sensors will monitor the health and status of humans, animals, and plants. And the mix of all the above sensors will be used to monitor a wide spectrum of parameters that are critical to the operation of machines, buildings, and living things.
Finally, there will be sensors that help us scan our environment for more precise navigation, logistics, weather prediction, agricultural planning, and pollution management. Sensors to watch for here include voice, facial recognition, chemical, biological, and 3D imaging sensors. Occipital Inc.’s 3D sensor provides a spatial view of the environment to be used in virtual and augmented reality and 3D scanning and printing.
We’ll certainly develop algorithms and analytics necessary to process sensor data in an increasingly automated and real-time fashion. But will our minds be able to grasp it all?
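As a rough illustration of the kind of automated, real-time processing described above, the sketch below smooths a noisy stream of readings with an exponential moving average and flags outliers against a fixed threshold. The readings, smoothing factor, and threshold are illustrative assumptions, not values from any particular sensor.

```python
# Minimal sketch of real-time sensor-stream processing: smooth each
# reading with an exponential moving average (EMA) and flag readings
# that deviate sharply from the smoothed signal.
def smooth_and_flag(readings, alpha=0.3, threshold=5.0):
    """Yield (smoothed_value, is_anomaly) for each raw reading."""
    ema = None
    for r in readings:
        # Seed the EMA with the first reading, then blend new readings in.
        ema = r if ema is None else alpha * r + (1 - alpha) * ema
        yield ema, abs(r - ema) > threshold

# Example: a mostly steady temperature signal with one spike.
raw = [20.0, 20.5, 19.8, 20.2, 35.0, 20.1]
results = list(smooth_and_flag(raw))
```

With these illustrative values, only the spike at 35.0 is flagged as an anomaly; the design choice is that smoothing happens in constant memory per sensor, which is what makes this style of processing feasible at the scale of trillions of devices.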
An approach that keeps the human at the center might prove helpful as we adapt to the new world of high-powered sensors. According to Professor Förster, we tend to consider technology as an enhancement of our biological nature and believe we can choose which types of technology we allow into our system. But when devices and sensors are ubiquitous, technology becomes like the air we breathe, rather than being a separate part of life, explains Förster. Thus the function of sensors should be to introduce new data streams that are compatible with our existing biological and value systems.
Getting under our skin
Researchers are already developing sensor technologies that are far more embedded than in the past. And they won’t just go into “things” like smart tennis rackets or ceiling fans. Nano-engineers at the University of California, San Diego, have developed a temporary tattoo that could enable non-invasive glucose testing. The FDA has accepted an application for the first digital drug-device combination: a pill for mental illness embedded with an ingestible sensor to track data on patients. MIT scientists have introduced a “Band-Aid of the future” that incorporates temperature sensors and tiny, drug-delivering reservoirs.
In the future we will see more sensors embedded in humans, animals, plants, and all kinds of everyday items. But how much data do we actually need – and how much can we digest? Much of this data will exist in the background where algorithms will separate the insight from the noise, while new types of sensors will allow us to interact directly with our environment.
Company leaders should consider how they could benefit by combining existing data with the new sensor data that will soon be available. They should monitor the development of sensor technology with an emphasis on where it stands to bypass or optimize traditional business processes, and where new sensor capabilities widen the scope of what we can measure today. And they should keep an eye outside their own domains for sensor advances that could transform their businesses in unexpected ways.
There’s no doubt that our five senses will soon be supplemented by man-made sensors numbering in the billions, capable of measuring almost anything we deem worthwhile. But deriving business value from them will require us to open our minds to the new possibilities.
Download the executive brief: Making Sense of Sensors
To learn more about how exponential technology will affect business and life, see Digital Futures.