For those not native to the programming world, it can get pretty confusing knowing which term means what when it comes to artificial intelligence (in other words, which one to use in a meeting to sound like you know what you’re talking about).
Machine learning is probably the term that most of us think of when we imagine artificial intelligence, whether we realise it or not. It isn’t the same as AI, though the terms are often used interchangeably, and really just refers to a certain way in which a machine can use large amounts of data to do things we think of as smart.
A basic computer program’s intelligence comes from the programmer who wrote the instructions: it is simply code that a computer runs from start to finish. With machine learning, by contrast, a computer is fed data that is used to train a mathematical model, which then produces an outcome, for instance a prediction.
A data analyst then verifies the result against more data from the same set. If the machine gets it wrong, adjustments are made to the model and the cycle runs again, until the model’s results on the training data match its results on the verification data. Once tuned correctly, the machine can then apply its predictive model to other datasets.
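To make the cycle above concrete, here is a minimal sketch in Python. The data, the single-parameter model and the error threshold are all invented for illustration: the program repeatedly adjusts a model on training data and stops once its predictions also hold up on held-out verification data.

```python
# Toy version of the train-and-verify cycle: fit a single parameter w
# so that y ≈ w * x, adjusting w until its error on held-out
# verification data is acceptably small. All values are illustrative.

train = [(1, 2.0), (2, 4.1), (3, 5.9)]   # (x, y) training pairs
verify = [(4, 8.0), (5, 10.1)]           # held-out verification pairs

def error(w, data):
    # Mean squared error of the model y = w * x on a dataset.
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

w = 0.0        # initial guess for the model parameter
lr = 0.02      # how strongly each adjustment corrects the model

for step in range(2000):   # the "adjust and run again" cycle
    grad = sum(2 * (w * x - y) * x for x, y in train) / len(train)
    w -= lr * grad                  # adjust the model
    if error(w, verify) < 0.01:     # stop once verification agrees
        break

print(round(w, 2))   # w ends up close to 2.0, the pattern in the data
```

Once tuned, the same `w` can be used to predict `y` for `x` values the model has never seen, which is the "other datasets" step described above.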
In some instances, computers can teach each other without the need for human intervention. In effect, they do this by learning to imitate reality.
Let’s say two computers are given a five dollar note. The first computer tries to reproduce the note, but initially gets some attributes wrong, say the right shape but the wrong colour. The second computer assesses this ‘forgery’ against the real note it holds, and decides no, these are not the same.
The first computer keeps trying, changing attributes each time until the second computer can’t tell the difference between the fake and the real note. In this way the first computer has learnt what makes a note a five dollar note, be it colour, shape, content, the number ‘5’ and so on. But no human ever programmed any specific rules to make it understand those facts. Amazing!
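The forger-and-checker idea above can be sketched in a few lines of Python. The attribute names and values here are invented for illustration, and the real technique (a generative adversarial network) uses neural networks rather than random tweaks, but the shape of the loop is the same: the checker only says same or different, never why, yet the forger still converges on every attribute of a real note.

```python
# Toy sketch of the forger-and-checker loop. The "forger" keeps
# tweaking attributes at random; the "checker" only answers same/not.
# Attribute names and values are invented for illustration.
import random

real_note = {"shape": "rectangle", "colour": "pink", "number": "5"}
options = {
    "shape": ["rectangle", "square", "circle"],
    "colour": ["pink", "blue", "green"],
    "number": ["5", "10", "20"],
}

def checker_accepts(fake):
    # The second computer: can it tell the fake from the real note?
    return fake == real_note

random.seed(0)
fake = {k: random.choice(v) for k, v in options.items()}  # first attempt
attempts = 0
while not checker_accepts(fake) and attempts < 10_000:
    attempts += 1
    attr = random.choice(list(options))          # change one attribute...
    fake[attr] = random.choice(options[attr])    # ...and try again

print(checker_accepts(fake))   # the forger has matched every attribute
```

No rule about colour, shape or the number ‘5’ appears anywhere in the forger’s logic; it learns those facts purely from the checker’s yes/no feedback.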
To further illuminate how machine learning works, take a look at this infographic produced by Alan Morrison and Anand Rao in their article on PwC’s Next in Tech.
Phil Bolton is a Director in PwC’s Insights team.