The recent surge of activity at the interface of statistical physics and machine learning has brought novel tools and ideas to both fields. Examples include the information bottleneck as a lens through which to view neural networks, the renormalization group as a conceptual basis for understanding deep learning, and the identification of phases of matter using machine learning methods. This workshop brings together researchers who take a statistical physics approach to machine learning, with the aim of using insights from physics to understand learning systems.
For more information and to register, visit the event site. The full schedule is available as a PDF download.
Tuesday, November 13th
9:00am - Coffee & Bagels
9:30am - A Universal Jeffreys Prior - Jordan Cotler
10:00am - Machine learning for many-body quantum physics - Giuseppe Carleo
10:30am - Break
11:00am - Layer-wise greedy optimization with an eye for RG - Zohar Ringel
11:30am - Neuroscience-based machine learning - Dmitri Chklovskii
12:00pm - Lunch
2:00pm - Density estimation using field theory - Justin Kinney
2:30pm - Discrete priors on simplified models optimize channel capacity from noisy experiments - Benjamin Machta
3:00pm - Break
3:30pm - Learning Quantum Emergence with AI - Eun-Ah Kim
4:00pm - Monte Carlo Study of Small Feedforward Neural Networks - Ariana Mann
Wednesday, November 14th
9:00am - Coffee & Bagels
9:30am - Manifold Tiling with an Unsupervised Neural Net - Anirvan Sengupta
10:00am - Reinforcement Learning to Prepare Quantum States Away from Equilibrium - Marin Bukov
10:30am - Break
11:00am - Quantum control landscapes and the limits of learning - Dries Sels
11:30am - Alex Alemi
12:00pm - Lunch
2:00pm - Entropy & mutual information in models of deep neural networks - Marylou Gabrié
2:30pm - Sloppy models, Differential geometry, and How Science Works - Jim Sethna
3:00pm - Break
3:30pm - Visualizing Probabilities: Intensive Principal Component Analysis - Katherine Quinn
4:00pm - Just do the best you can: statistical physics approaches to reinforcement learning - Chris Wiggins
4:30pm - Break
5:00pm - Panel Discussion
Thursday, November 15th
9:00am - Coffee & Bagels
9:30am - Which ReLU Net Architectures Give Rise to Exploding and Vanishing Gradients? - Boris Hanin
10:00am - Neural networks as interacting particle systems - Grant Rotskoff
10:30am - Break
11:00am - SGD Implicitly Regularizes Generalization Error - Dan Roberts
11:30am - Expressiveness in Deep Learning via Tensor Networks and Quantum Entanglement - Nadav Cohen
12:00pm - Normalizing Flows and Canonical Transformations - Austen Lamacraft
12:30pm - Lunch
2:00pm - Discussion
Friday, November 16th
9:30am - 6:15pm in the Skylight Room (Rm 9100)
Events begin at 9:30am with coffee and bagels and conclude a bit after 6pm. Lunch will be served.
Register here: https://goo.gl/forms/NMpuJjxwY81g8ETE2
Neural mechanisms for seeing without V1 - Tony Ro, The Graduate Center, CUNY
How the brain signals memories of what we've seen - Nicole Rust, University of Pennsylvania
How high-order image statistics shape cortical visual processing - Jonathan Victor, Weill Cornell School of Medicine
Using goal-driven deep neural networks to understand the visual pathway - Daniel Yamins, Stanford University
Sponsored by the Initiative for the Theoretical Sciences, and by the CUNY doctoral programs in Physics and Biology. Supported in part by the Center for the Physics of Biological Function, a joint effort of The Graduate Center and Princeton University. For more information see https://biophysics.princeton.edu.