I know it’s a strange question.
But as with any hot topic, there are many different terms thrown around that can mean slightly different things and lead to confusion.
Here is my attempt to clear things up.
The field of artificial intelligence (AI) research dates back to 1956, but the term has never referred to a specific type of technology. Rather, it’s a sociotechnical construct: it refers to machine capabilities that solve complex tasks which, until recently, only humans could perform.
So AI is an umbrella term that encompasses a lot of different technologies. Most of the current interest in AI involves machine learning – algorithms that can learn on their own from data sets without having to be explicitly programmed.
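To make the “learning from data” idea concrete, here is a minimal sketch in Python (my own illustration, not from any particular library): rather than programming the rule explicitly, a simple least-squares fit recovers the relationship between inputs and outputs from examples alone.

```python
# A minimal illustration of "learning from data": fit y = a*x + b
# from example pairs, rather than hard-coding the rule.

def fit_line(xs, ys):
    """Ordinary least-squares fit of a line to (x, y) pairs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var              # slope learned from the data
    b = mean_y - a * mean_x    # intercept learned from the data
    return a, b

# Examples generated by the hidden rule y = 2x + 1; the algorithm
# recovers that rule without ever being told it.
a, b = fit_line([0, 1, 2, 3, 4], [1, 3, 5, 7, 9])
print(a, b)  # → 2.0 1.0
```

This is exactly the kind of classical statistical algorithm the next paragraph refers to; modern machine learning applies the same principle with far more flexible models and far more data.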
The power of machine learning has grown dramatically thanks to the increased availability of data, the plummeting cost of information storage, and access to massive computing power through the cloud and ever more powerful processors.
Machine learning includes automated forms of the types of statistical algorithms you may have learned in school. These have long been used inside organizations, and have typically been referred to as data science, data mining, predictive analytics, or advanced analytics.
But because these terms have been in use for a long time, they are typically avoided by analysts and vendors trying to emphasize what is different about today’s AI opportunities.
One of the areas of greatest recent progress is deep learning. It’s a specific type of machine learning that uses neural networks with many layers, loosely inspired by the way our own brains work. It’s the key technology behind recent breakthroughs in image tagging, voice transcription, automatic translation, and more.
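As a rough sketch of what “layers” means here (a toy example with hand-picked numbers, not a real network): each layer transforms its input and passes the result to the next, and stacking layers is what lets a deep network build up increasingly abstract features.

```python
def relu(v):
    """A common activation function: zero out negative values."""
    return [max(0.0, x) for x in v]

def dense(inputs, weights, biases):
    """One fully connected layer: weighted sums of the inputs plus biases."""
    return [
        sum(w * x for w, x in zip(row, inputs)) + b
        for row, b in zip(weights, biases)
    ]

# A tiny two-layer network with illustrative, hand-picked weights;
# in practice the weights are learned from data by gradient descent.
x = [1.0, -2.0]
hidden = relu(dense(x, [[0.5, -0.5], [1.0, 1.0]], [0.0, 0.5]))
output = dense(hidden, [[1.0, -1.0]], [0.0])
print(output)  # → [1.5]
```

Real deep-learning systems stack dozens or hundreds of such layers with millions of learned weights, but the principle is the same: each layer feeds the next.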
Cognitive computing is another term that is typically used for the most sophisticated types of AI technology that try to mimic human reasoning. But its precise meaning isn’t clear, and many analysts and vendors avoid the term because it is so strongly associated with IBM.
The market seems to have settled into the following rule of thumb: If you’re talking about the business, consumer, or personal impact of these new technologies, you use the term artificial intelligence. But if you’re talking about the details of the technology to be implemented, you use the term machine learning.
Either way, the precise definitions should never get in the way of the primary goal of getting more value from your data. You have business needs, and there are technologies available now that can help. Whenever somebody tries to suck you into a nomenclature war and insists that they have the only “true” definition of one of these terms, try to bring the subject back to the concrete situation you are working on.
For more on how data can drive business strategy, see Why Machine Learning and Why Now?