
Statistical physics and machine learning


Adventures in the Theoretical Sciences, ITS summer school
Lectures by David Schwab

Thursday, June 18 and Friday, June 19
11:00 AM - 1:00 PM (EDT)

Over the past decade, machine learning (ML) and artificial intelligence have advanced at breakneck speed, with progress in nearly every domain, including object recognition, natural language processing, and robotics. There is a long history of cross-fertilization between statistical physics and ML, and the past few years in particular have seen an enormous growth of activity at this interface, as rapid progress in ML leaves a trail of remarkable phenomena that demand theoretical explanation. As neural networks are deployed ever more widely, in applications ranging from basic science and medicine to security, understanding how and when they work is increasingly urgent. Greater understanding will, of course, also allow us to further improve the performance of these models. Statistical physics is an ideal framework for studying neural networks because their power emerges from a large number of interacting degrees of freedom.

In these lectures, I will first provide an introduction to the core concepts and tools of machine learning in a manner intuitive to physicists. We will cover fundamental concepts in ML and modern statistics, such as the bias–variance decomposition, overfitting, regularization, generalization, and gradient descent, before moving on to more advanced and modern topics in both supervised and unsupervised learning. I will emphasize results where techniques from statistical physics have been especially fruitful and discuss areas ripe for future progress.
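As a toy illustration of two of the concepts listed above, gradient descent and L2 (ridge) regularization, here is a minimal numerical sketch in Python. It is not drawn from the lectures themselves; the synthetic data and the choices of learning rate `eta` and regularization strength `lam` are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: gradient descent on a ridge-regularized
# least-squares loss,
#   L(w) = (1/2n) ||X w - y||^2 + (lam/2) ||w||^2.
# The data and hyperparameters below are illustrative choices,
# not taken from the lectures.

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.normal(size=(n, d))
true_w = rng.normal(size=d)
y = X @ true_w + 0.1 * rng.normal(size=n)  # noisy linear targets

lam = 0.1        # regularization strength (hypothetical choice)
eta = 0.1        # learning rate (hypothetical choice)
w = np.zeros(d)  # initial parameters

for step in range(500):
    grad = X.T @ (X @ w - y) / n + lam * w  # gradient of L(w)
    w -= eta * grad                         # gradient descent update

print("recovered w:", np.round(w, 3))
print("true w:     ", np.round(true_w, 3))
```

The ridge term shrinks the learned weights toward zero, trading a small increase in bias for a reduction in variance, which is exactly the tradeoff the bias–variance decomposition makes precise.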

Click here for registration and details