AI algorithms are being used to drive vehicles, translate languages and even identify cancers. As such, they have captured the attention of the world’s media, yet these advances in machine learning are not the overnight sensation that many believe them to be. In this episode of LokadTV, we go right back to the start; we trace the gradual development of computational analysis over the last few decades and ask whether this steady evolution of the industry can give us any clues as to what the future may hold.
We discover what the first generation of computational statistical analysis, based on parametric models, looked like. In addition, we explore how the 80s and 90s welcomed the arrival of non-parametric statistical models, which increased the risk of overfitting, while the new millennium saw the arrival of Deep Learning, characterised by “hyper-parametric” models.
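To make the overfitting point concrete, here is a minimal sketch (not from the episode; the data and model choices are illustrative assumptions) contrasting a parametric model with few parameters against a far more flexible one. The flexible model always fits the observed points at least as well, which is precisely what makes it prone to chasing noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a simple linear trend plus noise (a stand-in for any
# real-world series, e.g. historical demand).
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, size=x.size)

# Parametric model: a straight line, only 2 parameters.
linear = np.polynomial.Polynomial.fit(x, y, deg=1)

# Highly flexible model: a degree-15 polynomial, 16 parameters.
flexible = np.polynomial.Polynomial.fit(x, y, deg=15)

def mse(model, xs, ys):
    """Mean squared error of a fitted model on data (xs, ys)."""
    return float(np.mean((model(xs) - ys) ** 2))

# Because the flexible model class contains the linear one, least
# squares guarantees it fits the training points at least as well.
assert mse(flexible, x, y) <= mse(linear, x, y) + 1e-9

# On fresh points from the same process, however, the flexible model
# typically does worse: it has partly memorised the noise.
x_new = np.linspace(0.025, 0.975, 20)
y_new = 2.0 * x_new + 1.0 + rng.normal(0.0, 0.1, size=x_new.size)
print("linear, new data:  ", mse(linear, x_new, y_new))
print("flexible, new data:", mse(flexible, x_new, y_new))
```

The fix that made non-parametric (and later hyper-parametric) models usable in practice was to control this flexibility, through regularisation, validation on held-out data, and, eventually, the sheer volume of data that deep learning consumes.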
The increase in spending on research has had a real impact on computational capability. We discuss how graphics processors, hardware originally designed to support video games, proved revolutionary in terms of raw processing power. We also discuss the impact of technologies such as the Cloud, which allow us to store more data, more efficiently.
Finally, we talk in more depth about the key problems that the “Big Four” are currently focusing on. We debate why the supply chain industry is often a few years behind the latest advances in technology and, to wrap things up, we try to understand how an executive can keep up in an environment that is constantly changing.