It’s increasingly clear that we, as humans, upload our identities every day. Our needs and propensities become quantified and contextualized. For companies like IBM Watson, understanding the human condition is important so technology can increasingly define patterns, learn, and potentially predict outcomes that benefit both business and industry.
We were pleased to host Rob High, IBM Fellow, vice president and chief technology officer, Watson Solutions, IBM Software Group. In this episode, Rob talks about:
- The definition of cognitive computing
- How Watson is aiding the advancement of health care
- Chef Watson and recipe recommendations
- Advanced cognitive systems and how they’re applied across different mediums
- The future of AI – Should humans be fearful?
You can listen to the podcast here or catch the episode here on Libsyn.
What is cognitive computing?
- Ultimately, cognitive computing’s greatest benefit is for people. By definition, it is the interpretation of the human condition, which includes everything we take in every day: our information, our communication. It deciphers the meaningful intent behind those inputs – the intent we use to make decisions in our everyday lives.
- Cognitive computing augments our own human cognition, giving us insight into and inspiration about the specific things we need to know to do our jobs better.
Classical computing methods have been unable to understand the underlying intent in how we, as humans, communicate with each other through voice and text, audible or written.
Cognitive computing does not replace human thinking. It does the research for you so you can do your thinking better.
How is Watson making strides in healthcare?
Watson can operate only on information in digital form, aggregating it and examining discrete elements to make the various treatment options explicit and better inform decision making. Massive amounts of data will uncover trends across the population and yield correlations that may help interpret and predict patient response to various treatments.
Through Watson’s work with MD Anderson Cancer Center, the Oncology Expert Advisor (OEA) was launched.
By pulling together and analyzing vast amounts of information from patient and research databases, the OEA is expected to help our care teams identify and fine-tune the best possible cancer treatments for our patients, while also alerting them to problems that arise during a patient’s care.
By accessing millions of patient records, Watson can help identify micro-segments of the population that share common traits: environmental exposures, genetics, heritage, and symptoms. These segments surface opportunities to apply that knowledge, helping determine how well someone with the same exposures will respond to certain treatments.
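The micro-segmentation idea above can be sketched in a few lines: group records by a chosen set of traits, then compute the observed treatment-response rate within each segment. This is a toy illustration with made-up records and trait names, not Watson's actual method.

```python
from collections import defaultdict

# Hypothetical patient records: traits plus whether a treatment succeeded.
patients = [
    {"exposure": "smoker", "gene": "BRCA1", "responded": True},
    {"exposure": "smoker", "gene": "BRCA1", "responded": False},
    {"exposure": "smoker", "gene": "BRCA1", "responded": True},
    {"exposure": "none",   "gene": "BRCA2", "responded": False},
    {"exposure": "none",   "gene": "BRCA2", "responded": False},
]

def segment_response_rates(records, trait_keys):
    """Group records into micro-segments by shared traits and
    compute the observed treatment-response rate per segment."""
    segments = defaultdict(list)
    for rec in records:
        key = tuple(rec[k] for k in trait_keys)
        segments[key].append(rec["responded"])
    return {key: sum(outcomes) / len(outcomes)
            for key, outcomes in segments.items()}

rates = segment_response_rates(patients, ["exposure", "gene"])
print(rates)
```

At real scale the interesting work is choosing which traits define a segment and validating that the correlations hold up; the grouping itself is the easy part.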
While health information across the world has been fragmented, Watson can process massive quantities of information (more than is humanly possible) and draw meaningful conclusions from it in a short period of time. Now doctors and patient caregivers who have documented success can share that information with other medical practitioners across the globe to accelerate diagnosis and treatment.
Chef Watson: “Ready to do some cognitive cooking?”
This was for me the most fascinating part of the segment: Chef Watson enables people to make decisions about menus, identifying and helping us discover new recipes based on our unique preferences.
At IBMchefwatson.com, Watson partnered with Bon Appétit, which provided 9,000 recipes for Watson to ingest and learn about the different types and styles of recipes. For a computer that innately has no sense of palate or smell, Watson learned about the taste makeup, the flavors, and the feeling that results when you consume a particular dish. It also learned about the science of taste chemistry and the chemical compounds that give recipes their specific tastes. From this perspective, it can begin to imitate the human senses. As per Rob:
Watson starts from scratch, dealing with many – potentially up to a quintillion – combinations of ingredients when it comes up with its unique recommendation every time.
It’s getting at the root of what makes people who they are – the things we experience are interpretable.
As an example, if you wanted a Belgian flavor for a given recipe, Watson will evaluate the different combinations of ingredients that pair well and produce a Belgian flavor, and may come up with different variations.
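One common heuristic behind this kind of flavor pairing is that ingredients sharing many flavor compounds tend to combine well. The sketch below scores ingredient pairs by their shared compounds; the ingredient names and compound profiles are invented for illustration, and Watson's real chemistry data and ranking are far richer than this.

```python
from itertools import combinations

# Hypothetical flavor-compound profiles; real systems draw on
# food-chemistry databases with thousands of compounds.
compounds = {
    "dark chocolate": {"pyrazine", "vanillin", "phenylacetic acid"},
    "belgian ale":    {"pyrazine", "isoamyl acetate", "phenylacetic acid"},
    "strawberry":     {"furaneol", "isoamyl acetate"},
    "blue cheese":    {"methyl ketone", "phenylacetic acid"},
}

def pairing_scores(profiles):
    """Score each ingredient pair by the number of shared flavor
    compounds (the 'food pairing' heuristic), best pairs first."""
    scores = {}
    for a, b in combinations(profiles, 2):
        scores[(a, b)] = len(profiles[a] & profiles[b])
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for pair, score in pairing_scores(compounds):
    print(pair, score)
```

With four ingredients there are only six pairs to score; a quintillion combinations is what you get once you consider full multi-ingredient recipes, which is why pruning and learned preferences matter in the real system.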
What started as a fun and interesting project has, through this cognitive ability, allowed Watson to venture into the art of the possible.
Patterns and the evolution of interpretations
Similar to the learnings with MD Anderson, there are trends or patterns within the data from which we can derive the greatest understanding or intention. Overlaying contextual history adds more of the human understanding. Collectively, these allow us to extract meaning. Cognitive systems draw on that meaning to bring the right set of information to humans and direct attention to just the right things to shape the decision-making process.
Until now, pervasive technology has been able to process only 20% of the world’s information. The other 80% of that data is the human condition: the spoken word, the written word, music, visual representations – all interpretations of our interests and needs. This is the heart of understanding. As Rob points out:
Multi-modal is how we communicate with each other: Not only what you’re hearing, but the intonation in the voice reflects the substance of that expression that’s being conveyed. Add the cadence that punctuates these points and now we know how humans understand each other. The computer needs to understand that as well.
Cognitive systems are not based on the same mathematical models as traditional computers. Interpreting the human condition means doing so in the presence of the idiosyncrasies and nuances carried through conversations and other communications.
Our words, our expressions are ambiguous…
Are these models reliable?
There is “no absolute level of correctness necessary;” the results are applied in the eyes of the beholder. The computer needs to be exposed to enough examples that it begins to surface patterns of meaning that allow it to work well in that context. Be prepared for outcomes to vary by environment, by time period, or when new variables are introduced.
What is the future of AI? Should we, as humans, be fearful?
The potential of cognitive computing is vast, and the amazing strides being made in the near future will be evidence of its inherent benefit to human strength and potential.
Technology will continue to progress and there will always be a risk that people and organizations will use it in nefarious ways.
Technology should not be feared. With increased understanding comes progress. It also means humans should be responsible and use it for the purposes for which it was intended.
As this information becomes more common, technology companies need to ensure safeguards are put in place to mitigate abuses of our privacy.
Rob High is an IBM Fellow, vice president and chief technology officer, Watson Solutions, IBM Software Group. He has overall responsibility to drive Watson Solutions technical strategy and thought leadership. As a key member of the Watson Solutions Leadership team, Rob works collaboratively with the Watson engineering, research, and development teams across IBM.
Want more on future tech and its effect on business? See Bring Your Robot To Work.