Yann LeCun, the director of AI research at Facebook, recently argued that ‘Deep Learning’ as a term has outlived its usefulness. He proposed that it is time for a new buzzword to reflect the development of a new kind of software: Differentiable Programming.
In this episode of LokadTV, we continue our mini-series by looking at Differentiable Programming’s origins and understanding where this technology actually emerged from. We discuss how the thinking behind it differs from the neural networks of old, and why we are revisiting techniques such as stochastic gradient descent and automatic differentiation, which have been around for well over 50 years.
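To make those two ingredients concrete, here is a minimal sketch (not from the episode; all names are hypothetical, chosen for illustration): forward-mode automatic differentiation via dual numbers, driving a simple gradient-descent update, the deterministic core of stochastic gradient descent.

```python
class Dual:
    """Dual number: a value paired with its derivative.

    Propagating the derivative through arithmetic is the core
    trick of forward-mode automatic differentiation.
    """
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __sub__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val - other.val, self.der - other.der)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__


def grad(f, x):
    """Derivative of f at x, computed automatically, not symbolically."""
    return f(Dual(x, 1.0)).der


# Fit w so that w * 3 ≈ 6, i.e. minimise the loss (w*3 - 6)^2
# by repeated gradient-descent steps.
loss = lambda w: (w * 3 - 6) * (w * 3 - 6)
w = 0.0
for _ in range(100):
    w -= 0.05 * grad(loss, w)
# w converges close to 2.0
```

The same pattern, differentiating an arbitrary program and nudging its parameters downhill, is what frameworks generalise to millions of parameters.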
For Differentiable Programming to come about, a number of basic intuitions had to be proved completely wrong, yet its development has been remarkably incremental. We examine why the classic idea of mimicking biological processes in technology leads to inefficiencies, and discuss how software can have an affinity with its underlying hardware. We learn how the different statistical models currently in use can be combined through floating-point computations. Finally, we discuss how Differentiable Programming can capture highly complex patterns and offers an open toolkit for optimization.
Yet, how exactly can new technological advances such as image and sound recognition, incredible though they may be, possibly be linked to supply chain optimization? Watch the video and find out.