Statistical Physics and Machine Learning
June 25 and 26
In the past decade, machine learning (ML) and artificial intelligence have advanced at breakneck speed, with progress in nearly every domain, including object recognition, natural language processing, and robotics. There is a long history of cross-fertilization between statistical physics and ML, but the past few years in particular have seen enormous growth of activity at this interface, as rapid progress in ML leaves a trail of remarkable phenomena that demand theoretical explanation. As neural networks are deployed in an ever wider range of applications, from basic science and medicine to security, understanding how and when they work is increasingly urgent. Greater understanding will, of course, also allow us to further improve the performance of these models. Statistical physics is an ideal framework for studying neural networks because their power emerges from a large number of interacting degrees of freedom.
In these lectures, I will first provide an introduction to the core concepts and tools of machine learning in a manner intuitive to physicists. We will cover fundamental concepts in ML and modern statistics, such as the bias–variance decomposition, overfitting, regularization, generalization, and gradient descent, before moving on to more advanced and modern topics in both supervised and unsupervised learning. I will emphasize results where techniques from statistical physics have been especially fruitful and discuss areas ripe for future work.
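As a concrete preview of the first of these topics, the bias–variance decomposition for squared-error loss can be stated as follows; the data model and notation here are illustrative, not taken from the lectures:

    % Bias-variance decomposition for squared-error loss, assuming the
    % data model y = f(x) + \epsilon with E[\epsilon] = 0, Var(\epsilon) = \sigma^2,
    % where \hat{h}_D is an estimator fit to a random training set D.
    \mathbb{E}_{D,\epsilon}\big[(y - \hat{h}_D(x))^2\big]
      = \underbrace{\big(f(x) - \mathbb{E}_D[\hat{h}_D(x)]\big)^2}_{\text{bias}^2}
      + \underbrace{\mathbb{E}_D\big[(\hat{h}_D(x) - \mathbb{E}_D[\hat{h}_D(x)])^2\big]}_{\text{variance}}
      + \underbrace{\sigma^2}_{\text{noise}}

Gradient descent and regularization can likewise be illustrated with a minimal sketch, here applied to ridge-regularized linear regression; the function name, hyperparameters, and toy data below are arbitrary choices for illustration, not material from the lectures:

    import numpy as np

    # Minimal gradient-descent sketch for ridge-regularized linear regression.
    # Minimizes (1/n) * ||X w - y||^2 + lam * ||w||^2 by following its gradient.
    def gradient_descent(X, y, lam=0.1, lr=0.01, steps=1000):
        w = np.zeros(X.shape[1])
        for _ in range(steps):
            grad = 2 * X.T @ (X @ w - y) / len(y) + 2 * lam * w
            w -= lr * grad
        return w

    # Toy usage: approximately recover the weights behind noisy linear data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)
    w_hat = gradient_descent(X, y)

The ridge penalty lam shrinks the fitted weights toward zero, trading a small increase in bias for a reduction in variance, which is exactly the tension the decomposition above makes precise.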
About the lecturer
David J. Schwab is Assistant Professor of Biology and Physics at the Graduate Center, CUNY, and a member of the Initiative for the Theoretical Sciences. He is a Simons Investigator in the Mathematical Modeling of Living Systems and a Sloan Fellow in Physics. He is interested in the physics of learning, both natural and artificial.
David can be reached with questions and comments at davidjschwab@gmail.com.