ITS @ CUNY

Calendar


Apr
11
4:15 PM

Filling metric spaces

  • Room 6496 at The Graduate Center CUNY (map)

Nonlinear Analysis and PDEs
The goal of these seminars is to discuss techniques used in nonlinear problems arising in applied mathematics, physics, or differential geometry.

Alexander Nabutovsky, University of Toronto 
Filling metric spaces 
Abstract: The Uryson k-width of a metric space X measures how close X is to being k-dimensional. Several years ago Larry Guth proved that if M is a closed n-dimensional manifold, and the volume of each ball of radius 1 in M does not exceed a certain small constant e(n), then the Uryson (n-1)-width of M is less than 1. This result is a significant generalization of Gromov's famous inequality relating the volume and the filling radius, which plays a central role in systolic geometry. Guth asked whether a much stronger and more general result holds: is there a constant e(m)>0 such that each compact metric space with m-dimensional Hausdorff content less than e(m) has (m-1)-dimensional Uryson width less than 1? Note that here the dimension of the metric space is not assumed to be m, and is allowed to be arbitrary. Such a result immediately leads to interesting new inequalities even for closed Riemannian manifolds. In my talk I am going to discuss a joint project with Yevgeny Liokumovich, Boris Lishak and Regina Rotman towards the positive resolution of Guth's problem.
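For readers unfamiliar with the inequality the abstract refers to, Gromov's filling inequality and Guth's question can be written as follows (the notation FillRad, UW for Uryson width, and HC for Hausdorff content is ours, not taken from the talk announcement):

```latex
% Gromov's filling inequality: for a closed n-manifold M,
% the filling radius is controlled by the volume.
\[
\operatorname{FillRad}(M^n) \;\le\; c_n \,\operatorname{Vol}(M^n)^{1/n},
\]
% Guth's question, as described in the abstract: does small
% m-dimensional Hausdorff content force small Uryson (m-1)-width,
% with no assumption on the dimension of X?
\[
\mathrm{HC}_m(X) < \varepsilon(m)
\;\overset{?}{\Longrightarrow}\;
\mathrm{UW}_{m-1}(X) < 1.
\]
```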

These events are sponsored by the Initiative for the Theoretical Sciences. For more information go to https://www.math.csi.cuny.edu/~mlucia/GCactivities.html.

Those participating in the Nonlinear Analysis and PDE seminar may also be interested in the Geometric Analysis Seminar, which meets Tuesdays in the same room (6496) starting at 3 PM.

Apr
5
11:00 AM

Marcos Rigol: Emergent eigenstate solution to quantum dynamics far from equilibrium

  • Room 5209, The Graduate Center, CUNY (map)

F Apr 5 11am Room 5209

Marcos Rigol, Pennsylvania State University

Quantum dynamics of interacting many-body systems has become a unique venue for the realization of novel states of matter. In this talk, we discuss how it can lead to the generation of time-evolving states that are eigenstates of emergent local Hamiltonians, not trivially related to the ones dictating the time evolution. We study geometric quenches in fermionic and bosonic systems in one-dimensional lattices, and provide examples of experimentally relevant time-evolving states [1,2] that are either ground states or highly excited eigenstates of emergent local Hamiltonians [3]. We also discuss the expansion of Mott insulating domains at finite temperature. Surprisingly, the melting of the Mott domain is accompanied by an effective cooling of the system [4]. We explain this phenomenon analytically using the equilibrium description provided by the emergent local Hamiltonian [4,5].

Part of the Condensed matter physics seminar series
Organizers: Sarang Gopalakrishnan & Tankut Can

Click here for full series printable PDF.

Feb
15
11:00 AM

Complexity of Linear Regions in Deep Networks

  • Room 5209, The Graduate Center, CUNY (map)

F Feb 15 11am Room 5209

Boris Hanin, Texas A&M University
Organizers: Sarang Gopalakrishnan & Tankut Can

More info:
Boris Hanin is a mathematician working on deep learning and mathematical physics. Before joining the faculty of the Math Department at Texas A&M in 2017, he was an NSF Postdoc in Math at MIT. He is currently a Visiting Scientist at Facebook AI Research in NYC.

“I will present several new results, joint with David Rolnick, about the number of linear regions and the sizes of the boundaries of linear regions in a network N with piecewise linear activations and random weights/biases.

I will discuss a new formula for the average complexity of linear regions that holds even for highly correlated weights and biases, and hence is valid throughout training. It shows, for example, that at initialization, the number of regions along any 1D line grows like the number of neurons in N. In particular, perhaps surprisingly, this number is not exponential in the depth of the network.

I will explain the analog of this result for higher input dimension and will report on a number of experiments, which demonstrate empirically that our precise theorems at initialization can be expected to hold qualitatively throughout training.”
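The "regions along a 1D line" count from the abstract is easy to probe numerically. Below is a minimal sketch (our own illustration, not code from the talk; the network sizes and function names are arbitrary choices) that builds a random ReLU network and counts the linear regions crossed along a line segment by tracking changes in the ReLU activation pattern:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(widths):
    """Random weights/biases for a fully connected ReLU network."""
    params = []
    for n_in, n_out in zip(widths[:-1], widths[1:]):
        W = rng.normal(0.0, np.sqrt(2.0 / n_in), (n_out, n_in))
        b = rng.normal(0.0, 0.1, n_out)
        params.append((W, b))
    return params

def activation_pattern(params, x):
    """Binary pattern recording which hidden ReLUs fire at input x."""
    pattern, h = [], x
    for W, b in params[:-1]:  # last layer is linear, no ReLU
        z = W @ h + b
        pattern.append(z > 0)
        h = np.maximum(z, 0.0)
    return np.concatenate(pattern)

def regions_along_line(params, x0, x1, n_samples=10_000):
    """Count linear regions crossed on the segment x0 -> x1.

    The network is piecewise linear, so each distinct run of
    consecutive activation patterns along the line is one region.
    """
    count, prev = 0, None
    for t in np.linspace(0.0, 1.0, n_samples):
        p = activation_pattern(params, (1.0 - t) * x0 + t * x1)
        if prev is None or not np.array_equal(p, prev):
            count += 1
        prev = p
    return count

widths = [2, 16, 16, 1]            # input dim 2, two hidden layers of 16
params = random_relu_net(widths)
x0, x1 = np.array([-3.0, 0.0]), np.array([3.0, 0.0])
n_regions = regions_along_line(params, x0, x1)
n_neurons = sum(widths[1:-1])      # 32 hidden ReLUs
print(n_regions, n_neurons)
```

At random initialization the region count along the line should be on the order of the number of hidden neurons (here 32), not anywhere near 2^depth, consistent with the claim in the abstract. Note the sampling scheme can only undercount regions if two pattern changes fall between consecutive sample points.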

Click here for full series printable PDF.
