Why The Time Is Now For AI And Machine Learning

Colm Maloney

The world of digital transformation is growing at an exponential rate, and the focus of digitalization is rapidly shifting from one technology enabler to another.

A couple of years back, it was all social, mobile, analytics, and cloud (SMAC). While these technologies are still significant enablers, they have become commonplace in digital strategies, which is driving other technologies to provide much-needed differentiation in a sea of competition and disruptors.

The current hype is all Big Data, Internet of Things (IoT), blockchain, and artificial intelligence (AI). One sign of that hype is Jamie Dimon of JP Morgan speaking out against blockchain and Bitcoin, a signal of how seriously the banking industry is taking these technologies. Another is the potential economic value these technologies could unlock, which is beyond impressive; one example is a recent report from Accenture called Why AI is the Future of Growth.

With all the commotion, I found myself wanting to understand more about AI, and more specifically machine learning (ML). When I was in college 20-plus years ago, there was a lot of discussion around AI, and I was curious about what has changed.

First, AI is nothing new; it has been around a long time:

  • The Turing test was established by Alan Turing in 1950
  • Arthur Samuel, who coined the term “machine learning,” built one of the first learning machines in 1959, which went on to beat him at checkers [or draughts if you come from Ireland]
  • IBM’s Deep Blue beat Garry Kasparov at chess in 1997
  • DeepMind’s AlphaGo beat Ke Jie at the ancient Chinese game of Go in May 2017

But that does not explain why AI made so little progress between my college years and 2013, nor why we have made such rapid leaps recently.

One of the biggest changes is the ready availability of processing power, with Moore’s Law chugging away year after year. We have so much more processing power today than we had in my college days. The other big hardware change is the use of powerful graphics processing units (GPUs) to handle complex, compute-intensive workloads. For a fun illustration of how things have changed, watch this video from Nvidia.

And the promise of quantum computing makes Moore’s Law look pedestrian, once we get our collective heads around what an optimization algorithm looks like when it can harness quantum hardware.

Cloud is also an important enabler; you can now rent your favorite Infrastructure as a Service (IaaS) and run your most complex algorithm in the cloud without the delays of buying the hardware, hiring the infrastructure guys, installing, testing, etc. We have the agility to get started much faster than we did BC (before cloud).

When I was in university we studied algorithms, some of which are used in AI or ML today, and the underlying math of algebra, statistics, and calculus is largely the same. Linear regression, one of the mainstays of ML, is a good place to start if you want to use patterns or trends in data to predict something. Adding a loss function and minimizing it with gradient descent pulls the predictions closer to the actual values in your learning sample.
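To make that concrete, here is a minimal toy sketch (my own illustration, not tied to any particular framework or vendor) of fitting a straight line to a small learning sample by running gradient descent on a mean-squared-error loss:

```python
# A minimal, illustrative sketch: linear regression fitted with gradient
# descent on a mean-squared-error loss, using plain NumPy.
import numpy as np

# Toy "learning sample": y is roughly 3*x + 2 plus some noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 3.0 * x + 2.0 + rng.normal(0, 1, size=100)

w, b = 0.0, 0.0          # model: y_hat = w*x + b
learning_rate = 0.01

for step in range(2000):
    y_hat = w * x + b
    error = y_hat - y                     # residuals
    loss = np.mean(error ** 2)            # mean squared error (the loss function)
    grad_w = 2 * np.mean(error * x)       # dLoss/dw
    grad_b = 2 * np.mean(error)           # dLoss/db
    w -= learning_rate * grad_w           # gradient descent step
    b -= learning_rate * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final loss={loss:.3f}")
```

Each pass computes how far the current line is from the sample, then nudges the slope and intercept in the direction that reduces that error, which is all gradient descent really is.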

Finally, the abundance of data is a huge boon for machine learning – the “learning” part of ML is where Big Data comes in: the algorithm looks back at history and learns from that data to help predict the future.

Of course, all of this is dramatically easier with the frameworks provided by software vendors like SAP, Google, and Microsoft. A framework lets you create, run, and consume ML applications as part of a digital innovation platform. These platforms provide so much capability out of the box that, if you intend to apply machine learning to data residing in your existing solutions, one of them should be part of your plan.

Probably the most famous framework in AI at the moment is TensorFlow, which Google recently released as an open-source library. It is used widely, including by DeepMind, the company behind AlphaGo, and by neuroscientists working to help clinicians get patients from testing to treatment faster.
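To give a flavour of what working with TensorFlow looks like, here is a small sketch using the high-level Keras API that ships with it. The data, layer sizes, and training settings are arbitrary placeholders of my own, not a recommended setup:

```python
# A small illustrative sketch using TensorFlow's Keras API; the data and
# layer sizes are arbitrary placeholders, not a recommended configuration.
import numpy as np
import tensorflow as tf

# Toy binary-classification data: 1,000 samples with 20 features each
X = np.random.rand(1000, 20).astype("float32")
y = (X.sum(axis=1) > 10).astype("float32")   # label depends on the feature sum

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Train on the toy sample, holding back 20% to check how well it generalizes
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```

The point is not the model itself but how little code stands between you and a trained network once a framework like this handles the plumbing.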

By combining TensorFlow’s capabilities with an in-memory database, you can deliver some incredible results. Since we are still a long way from general-purpose AI, applying these frameworks to specific problems is where AI can add the most value in the short term. Several compelling use cases are already in the market, such as automatic payment matching, finding the best talent through intelligent job matching, and logo/brand recognition.

I hope you are as excited about the potential for AI and ML as I am. While it has been around for some time, it is still in its infancy with so much development to come. Bring it on, I say.

See how you can turn insight into action, make better decisions, and transform your business.



About Colm Maloney

Colm has worked across all elements of the Business Applications market (ERP, CRM & SCM) for 20+ years, developing, selling, and deploying business applications with some of the world’s largest brands, including Unilever, Nokia, British American Tobacco, Caterpillar, Bank of China, DHL, and Airbus Industries, to name a few. Since joining SAP in New Zealand, Colm ran Solution Engineering for SAP Customer Engagement & Commerce in APJ and recently moved into the role of Presales Director in ANZ, looking after our Customer Solution Manager community. Prior to joining SAP, Colm worked at organizations such as Microsoft, Sterling Commerce, and i2 Technologies across five continents: Europe, North America, Asia, Africa, and Australia. Colm was born and raised in Ireland, holds a Master’s degree in Operations Research from University College Dublin, and currently resides in Auckland, New Zealand.