ITS @ CUNY

Calendar

Condensed Matter Physics Seminar Series: Jie Wang
Oct 18, 11:00 AM

  • Room 5209 at The Graduate Center, CUNY

Friday, October 18th at 11 am
Room 5209 at The Graduate Center, CUNY

Jie Wang, Flatiron Institute
https://www.simonsfoundation.org/team/jie-wang/

"Emergent Dirac fermions in Composite Fermi Liquids".

Abstract: Interacting electrons in high magnetic fields exhibit rich physical phenomena, including the gapped fractional quantum Hall effects and gapless states. The composite Fermi liquids (CFLs) are gapless states that can occur at even-denominator Landau level fillings. Following the celebrated work of Halperin, Lee and Read (1994), the CFLs were understood as Fermi liquids of composite fermions, which are bound states of electrons and magnetic flux quanta. However, at 1/2 filling it is not obvious why the HLR description is consistent with particle-hole symmetry. Motivated by this, Son (2015) proposed an alternative description of CFLs at 1/2, according to which the composite fermions are instead emergent Dirac fermions. Importantly, Son's theory predicts a π Berry curvature singularity at the center of the composite Fermi sea. In this talk, I will focus on the Berry phase aspects of CFLs and discuss the emergent Dirac fermions at low energy at one-half and other generic filling fractions.
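For orientation (a textbook fact, not a result specific to this talk): a two-component Dirac fermion acquires a Berry phase of π around any loop enclosing its node, which is the singularity Son's theory places at the center of the composite Fermi sea.

```latex
% Two-component Dirac Hamiltonian: H(\mathbf{k}) = v\,(k_x \sigma_x + k_y \sigma_y).
% Its eigenstates wind once as \mathbf{k} encircles the node, giving a Berry phase
\gamma \;=\; \oint_{C} i\, \langle u(\mathbf{k}) \,|\, \nabla_{\mathbf{k}}\, u(\mathbf{k}) \rangle \cdot d\mathbf{k} \;=\; \pi \pmod{2\pi}.
```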


J Sethna: Sloppy models, differential geometry, and the space of model predictions
Sep 13, 11:00 AM

  • Room 5209 at The Graduate Center, CUNY

James P. Sethna, Katherine Quinn, Archishman Raju, Mark Transtrum, Ben Machta, Ricky Chachra, Ryan Gutenkunst, Joshua J. Waterfall, Fergal P. Casey, Kevin S. Brown, Christopher R. Myers
Cornell University

Models of systems biology, climate change, ecology, complex instruments, and macroeconomics have parameters that are hard or impossible to measure directly. If we fit these unknown parameters, fiddling with them until they agree with past experiments, how much can we trust their predictions? We have found that predictions can be made despite huge uncertainties in the parameters – many parameter combinations are mostly unimportant to the collective behavior. We will use ideas and methods from differential geometry and approximation theory to explain sloppiness as a ‘hyper-ribbon’ structure of the manifold of possible model predictions. We show that physics theories are also sloppy – that sloppiness may be the underlying reason why the world is comprehensible. We will present new methods for visualizing this model manifold for probabilistic systems – such as the space of possible universes as measured by the cosmic microwave background radiation.
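As a concrete illustration (a minimal sketch of my own, not the speakers' code): fitting a sum of two nearly degenerate exponentials already produces the characteristic sloppy spectrum, with the eigenvalues of J^T J (the Fisher information, for unit Gaussian noise) spread over orders of magnitude.

```python
# Minimal sloppiness demo: y(t) = exp(-a t) + exp(-b t) with a close to b.
# The eigenvalues of J^T J split into one "stiff" and one "sloppy" direction.
import numpy as np

t = np.linspace(0, 5, 50)

def model(params):
    a, b = params
    return np.exp(-a * t) + np.exp(-b * t)

def jacobian(params, eps=1e-6):
    # Central finite-difference Jacobian of the model outputs w.r.t. parameters.
    p = np.asarray(params, dtype=float)
    J = np.empty((t.size, p.size))
    for i in range(p.size):
        dp = np.zeros_like(p)
        dp[i] = eps
        J[:, i] = (model(p + dp) - model(p - dp)) / (2 * eps)
    return J

J = jacobian([1.0, 1.1])            # two nearly degenerate decay rates
print(np.linalg.eigvalsh(J.T @ J))  # eigenvalues spread over orders of magnitude
```

The stiff eigendirection moves the predictions; the sloppy one barely does, which is how predictions can be tight while individual parameters remain badly determined.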

Seminar with:

James Sethna, Cornell University
http://sethna.lassp.cornell.edu/

May 16, 4:15 PM

Three-dimensional magnetohydrodynamics system forced by space-time white noise 

  • Room 6496 at The Graduate Center, CUNY

Kazuo Yamazaki, University of Rochester 

Abstract: The magnetohydrodynamics system consists of the Navier-Stokes equations forced by the Lorentz force, coupled with Maxwell's equations from electromagnetism. This talk will be a relatively expository overview of research on stochastic PDEs forced by space-time white noise, together with a new result on the three-dimensional magnetohydrodynamics system forced by space-time white noise. In short, because the noise is white not only in time but also in space, the solution becomes extremely rough in the spatial variables, with regularity akin to that of a distribution; the nonlinear term is then difficult to define in any classical sense, because there is no universal agreement on the product of one distribution with another. Our discussion will also touch on the following systems of equations: the Kardar-Parisi-Zhang equation and the Boussinesq system. The following notions and techniques may also appear: Feynman diagrams, local subcriticality, paracontrolled distributions, renormalizations, regularity structures, rough path theory, Wick products, and Young's integral.
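For reference, a common formulation of the stochastic incompressible MHD system (u the velocity, b the magnetic field, p the pressure, ξ₁ and ξ₂ space-time white noises; this is one standard normalization, not necessarily the exact system of the talk):

```latex
\partial_t u + (u \cdot \nabla) u - \nu \Delta u + \nabla p = (b \cdot \nabla) b + \xi_1, \qquad \nabla \cdot u = 0,
\partial_t b + (u \cdot \nabla) b - \eta \Delta b = (b \cdot \nabla) u + \xi_2, \qquad \nabla \cdot b = 0.
```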

Part of the Non-Linear Study Group. For more info, see https://www.math.csi.cuny.edu/~mlucia/GCactivities.html

May 9, 4:15 PM

A spiral interface with positive Alt-Caffarelli-Friedman limit at the origin 

  • Room 6496 at The Graduate Center, CUNY

Dennis Kriventsov, Rutgers University 

Abstract: I will discuss an example of a pair of continuous nonnegative subharmonic functions, each vanishing where the other is positive, which have a strictly positive limit for the Alt-Caffarelli-Friedman monotonicity formula at the origin, but for which the origin is not a point of differentiability for the boundary of their supports. Time permitting, I will also discuss some further progress on related problems. This is based on joint work with Mark Allen.
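For context, the monotonicity formula in question, in its standard form (notation mine): for continuous nonnegative subharmonic u, v on the unit ball in R^n with uv = 0 and u(0) = v(0) = 0, the Alt-Caffarelli-Friedman functional

```latex
J(r) \;=\; \frac{1}{r^4} \left( \int_{B_r} \frac{|\nabla u|^2}{|x|^{\,n-2}} \, dx \right) \left( \int_{B_r} \frac{|\nabla v|^2}{|x|^{\,n-2}} \, dx \right)
```

is nondecreasing in r; the talk concerns an example where the limit J(0+) is strictly positive yet the interface fails to be differentiable at the origin.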

Part of the Non-Linear Study Group. For more info, see https://www.math.csi.cuny.edu/~mlucia/GCactivities.html

May 6, 2:00 PM

The information bottleneck theory of deep learning, and the computational benefits of hidden layers

  • The Science Center at The Graduate Center, CUNY (Rm 4102)

Naftali Tishby, Hebrew University of Jerusalem

Abstract:
In the past several years we have developed a comprehensive theory of large-scale learning with Deep Neural Networks (DNNs) optimized with Stochastic Gradient Descent (SGD). The theory is built on three components: (1) rethinking the standard (PAC-like), distribution-independent, worst-case generalisation bounds, turning them into problem-dependent, typical (in the Information Theory sense) bounds that are independent of the model architecture.

(2) The Information Plane theorem: for large-scale typical learning, the sample-complexity and accuracy tradeoff is characterized by only two numbers: the mutual information that the representation (a layer in the network) maintains about the input patterns, and the mutual information each layer has about the desired output label. The information-theoretically optimal tradeoff between these encoder and decoder information values is given by the Information Bottleneck (IB) bound for the rule-specific input-output distribution. (3) The layers of the DNN reach this optimal bound via standard SGD training, in high (input and layer) dimension.
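In its standard form (as introduced in the information bottleneck literature; T denotes a layer's representation of the input X with label Y), the tradeoff is governed by minimizing the IB Lagrangian over stochastic encoders p(t|x):

```latex
\mathcal{L}\big[p(t \mid x)\big] \;=\; I(X;T) \;-\; \beta\, I(T;Y),
```

where the multiplier β sets the exchange rate between compressing the input and preserving information about the label.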

In this talk I will briefly review these results and discuss two new, surprising outcomes of this theory: (1) the computational benefit of the hidden layers, and (2) the emerging understanding of the features encoded by each layer, which follows from the convergence to the IB bound.

Based on joint works with Noga Zaslavsky, Ravid Ziv, and Amichai Painsky.

Naftali Tishby is the Ruth & Stan Flinkman Professor in Brain Research at the Hebrew University of Jerusalem, where he is a member of The Benin School of Computer Science and Engineering and The Edmond and Lilly Safra Center for Brain Sciences. Educated as a physicist, he has made profound contributions to problems ranging from chemical reaction dynamics to speech recognition, and from natural language processing to the dynamics of real neural networks in the brain. In the late 1980s Tishby and colleagues recast learning in neural networks as a statistical physics problem, and went on to discover that learning in large networks could show phase transitions, as exposure to increasing numbers of examples “cools” the parameters of the network into a range of values that provides qualitatively better performance. Most recently he has emerged as one of the leading figures in efforts to understand the success of deep learning, and this will be the topic of his seminar.

We have set aside two hours, in the hopes of encouraging greater interaction and discussion.

Sponsored by the Initiative for the Theoretical Sciences, and by the CUNY doctoral programs in Physics and Biology.

Supported in part by the Center for the Physics of Biological Function, a joint effort of The Graduate Center and Princeton University

Apr 11, 4:15 PM

Filling metric spaces

  • Room 6496 at The Graduate Center, CUNY

Nonlinear Analysis and PDEs
The goal of these seminars is to discuss techniques used in nonlinear problems arising in applied mathematics, physics, or differential geometry.

Alexander Nabutovsky, University of Toronto 
Filling metric spaces 
Abstract: The Uryson k-width of a metric space X measures how close X is to being k-dimensional. Several years ago Larry Guth proved that if M is a closed n-dimensional manifold, and the volume of each ball of radius 1 in M does not exceed a certain small constant e(n), then the Uryson (n-1)-width of M is less than 1. This result is a significant generalization of Gromov's famous inequality relating the volume and the filling radius, which plays a central role in systolic geometry. Guth asked whether a much stronger and more general result holds true: is there a constant e(m)>0 such that each compact metric space with m-dimensional Hausdorff content less than e(m) always has (m-1)-dimensional Uryson width less than 1? Note that here the dimension of the metric space is not assumed to be m, and is allowed to be arbitrary. Such a result immediately leads to interesting new inequalities even for closed Riemannian manifolds. In my talk I am going to discuss a joint project with Yevgeny Liokumovich, Boris Lishak and Regina Rotman towards the positive resolution of Guth's problem.
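Restating the theorem and the question in symbols (notation mine):

```latex
% Guth's theorem: for a closed n-dimensional manifold M,
\sup_{x \in M} \operatorname{vol}\big(B(x,1)\big) \le \varepsilon(n)
  \;\Longrightarrow\; \mathrm{UW}_{n-1}(M) < 1.
% Guth's question: for a compact metric space X of arbitrary dimension,
\mathrm{HC}_m(X) < \varepsilon(m)
  \;\overset{?}{\Longrightarrow}\; \mathrm{UW}_{m-1}(X) < 1,
% where HC_m is the m-dimensional Hausdorff content and UW_k the Uryson k-width.
```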

These events are sponsored by the Initiative for the Theoretical Sciences. For more information go to https://www.math.csi.cuny.edu/~mlucia/GCactivities.html.

Those participating in the Nonlinear Analysis and PDE seminar may also be interested in the Geometric Analysis Seminar which meets Tuesdays in the same room 6496 starting at 3pm.

Apr 5, 11:00 AM

Marcos Rigol: Emergent eigenstate solution to quantum dynamics far from equilibrium

  • Room 5209, The Graduate Center, CUNY


Marcos Rigol, Pennsylvania State University

Quantum dynamics of interacting many-body systems has become a unique venue for the realization of novel states of matter. In this talk, we discuss how it can lead to the generation of time-evolving states that are eigenstates of emergent local Hamiltonians, not trivially related to the ones dictating the time evolution. We study geometric quenches in fermionic and bosonic systems in one-dimensional lattices, and provide examples of experimentally relevant time-evolving states [1,2] that are either ground states or highly excited eigenstates of emergent local Hamiltonians [3]. We also discuss the expansion of Mott insulating domains at finite temperature. Surprisingly, the melting of the Mott domain is accompanied by an effective cooling of the system [4]. We explain this phenomenon analytically using the equilibrium description provided by the emergent local Hamiltonian [4,5].

Part of the Condensed matter physics seminar series
Organizers: Sarang Gopalakrishnan & Tankut Can


Feb 15, 11:00 AM

Complexity of Linear Regions in Deep Networks

  • Room 5209, The Graduate Center, CUNY


Boris Hanin, Texas A&M University
Organizers: Sarang Gopalakrishnan & Tankut Can

More info:
Boris Hanin is a mathematician working on deep learning and mathematical physics. Before joining the faculty of the Math Department at Texas A&M in 2017, he was an NSF Postdoc in Math at MIT. He is currently a Visiting Scientist at Facebook AI Research in NYC.

“I will present several new results, joint with David Rolnick, about the number of linear regions and the sizes of the boundaries of linear regions in a network N with piecewise linear activations and random weights/biases.

I will discuss a new formula for the average complexity of linear regions that holds even for highly correlated weights and biases, and hence is valid throughout training. It shows, for example, that at initialization, the number of regions along any 1D line grows like the number of neurons in N. In particular, and perhaps surprisingly, this number is not exponential in the depth of the network.

I will explain the analog of this result for higher input dimension and will report on a number of experiments, which demonstrate empirically that our precise theorems at initialization can be expected to hold qualitatively throughout training.”
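One quick way to see the "number of neurons, not exponential in depth" scaling is to count distinct ReLU activation patterns along a 1D line numerically (a minimal sketch under assumptions of my own, not the speakers' code):

```python
# Count the linear regions a random ReLU network cuts along a 1D line
# through input space, by counting changes in the activation pattern.
import numpy as np

rng = np.random.default_rng(0)
width, depth, d_in = 32, 4, 10

# Random weights and biases, roughly "at initialization".
Ws = [rng.normal(0.0, 1.0 / np.sqrt(width), (width, d_in if l == 0 else width))
      for l in range(depth)]
bs = [rng.normal(0.0, 0.1, width) for _ in range(depth)]

def activation_pattern(x):
    # Concatenated on/off pattern of every ReLU unit at input x.
    h, bits = x, []
    for W, b in zip(Ws, bs):
        pre = W @ h + b
        bits.append(pre > 0)
        h = np.maximum(pre, 0.0)
    return np.concatenate(bits).tobytes()

# Walk along a random line x(s) = x0 + s v; each pattern change marks
# a boundary between linear regions.
x0, v = rng.normal(size=d_in), rng.normal(size=d_in)
patterns = [activation_pattern(x0 + s * v) for s in np.linspace(-5, 5, 20000)]
regions = 1 + sum(p != q for p, q in zip(patterns, patterns[1:]))
print(f"{regions} regions along the line vs. {width * depth} neurons")
```

Each neuron contributes a crossing wherever the line meets its bent hyperplane, and at initialization each contributes O(1) of them, consistent with the linear-in-neurons count.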

