
New Postdocs talk at ITS

  • Room 5209, 365 5th Avenue, New York, NY 10016, United States



Join us for two talks by the new incoming postdocs at ITS, with lunch in between.

Daniel Schubring (11am):

Title: Renormalons in 2D asymptotically free theories

Abstract: Divergent perturbation series are ubiquitous in physics, and they can often be treated by resummation techniques such as Borel summation. But in asymptotically free theories like QCD there may be obstructions to resummation referred to as "renormalons." If the perturbation series is extended to a "transseries" that includes information about non-perturbative sectors, then the renormalons should mutually cancel. This is shown explicitly for certain 2D non-linear sigma models, including the n-vector model common in statistical physics.
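
For non-specialists, here is a schematic, textbook-level sketch of Borel summation and of how a renormalon obstructs it; the symbols (the coupling g and the constant a) are generic placeholders, not quantities taken from the talk.

```latex
% Schematic background (not the talk's specific results): a series with
% factorially growing coefficients, its Borel transform, and the
% attempted Borel resummation.
f(g) \sim \sum_{n=0}^{\infty} c_n\, g^{n+1}, \qquad c_n \sim n!\, a^{n},
\qquad
\mathcal{B}f(t) = \sum_{n=0}^{\infty} \frac{c_n}{n!}\, t^{n}, \qquad
f(g) \overset{?}{=} \int_0^{\infty} dt\; e^{-t/g}\, \mathcal{B}f(t).
% With c_n \sim n!\, a^n the Borel transform has a pole at t = 1/a on the
% integration contour (a "renormalon"); routing the contour above or below
% that pole changes the answer by a term of order e^{-1/(a g)}, the size of
% a non-perturbative contribution, which is why the non-perturbative
% sectors of a transseries are expected to cancel the ambiguity.
```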

Francesca Mignacco (1pm):

Title: "Statistical physics insights into stochastic gradient descent"

Abstract: Artificial neural networks (ANNs) trained with the stochastic gradient descent (SGD) algorithm have achieved impressive performance in a variety of applications. However, this practical success remains largely unexplained by theory. A general consensus has arisen that the answer requires a detailed description of the trajectory traversed during training. This task is highly nontrivial for at least two reasons. First, the high dimensionality of the parameter space in which ANNs typically operate defies standard mathematical techniques. Second, SGD navigates a non-convex loss landscape following out-of-equilibrium dynamics with complicated, state-dependent noise. In this talk, I will consider prototypical learning problems that are amenable to exact characterization. I will show how dynamical mean-field theory from statistical physics can be used to derive an effective low-dimensional description of the network performance and of the learning dynamics of multi-pass SGD. Finally, I will discuss how different sources of algorithmic noise affect the performance of the network.
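
For readers less familiar with the algorithm, below is a minimal, self-contained sketch of multi-pass SGD on a toy classification problem; the model, data, and hyperparameters are illustrative assumptions and are not the prototypical problems analyzed with dynamical mean-field theory in the talk. Its only purpose is to show where the state-dependent mini-batch noise mentioned in the abstract comes from: each update uses the gradient on a small random batch rather than on the full dataset, and how much that gradient fluctuates depends on the current weights.

```python
import numpy as np

# Minimal multi-pass SGD sketch on a toy logistic-regression problem
# (illustrative only; problem and hyperparameters are assumptions).
rng = np.random.default_rng(0)

n, d = 1000, 50                        # samples, input dimension
w_true = rng.normal(size=d) / np.sqrt(d)
X = rng.normal(size=(n, d))
y = np.sign(X @ w_true)                # toy binary labels in {-1, +1}

def minibatch_grad(w, idx):
    """Gradient of the logistic loss on the mini-batch indexed by idx."""
    z = y[idx] * (X[idx] @ w)
    return -(y[idx] / (1.0 + np.exp(z))) @ X[idx] / len(idx)

w = np.zeros(d)
lr, batch_size, epochs = 0.5, 10, 20

for epoch in range(epochs):            # multi-pass: the same data is reused each epoch
    order = rng.permutation(n)
    for start in range(0, n, batch_size):
        idx = order[start:start + batch_size]
        # Mini-batch gradient = full gradient + noise whose strength
        # depends on the current weights w (state-dependent noise).
        w -= lr * minibatch_grad(w, idx)

train_acc = np.mean(np.sign(X @ w) == y)
print(f"training accuracy after {epochs} passes: {train_acc:.3f}")
```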