Complexity of Linear Regions in Deep Networks

  • Room 5209, The Graduate Center, CUNY, 365 5th Avenue, New York, NY 10016, United States

Friday, February 15, 11:00 am, Room 5209

Boris Hanin, Texas A&M University
Organizers: Sarang Gopalakrishnan & Tankut Can

More info:
Boris Hanin is a mathematician working on deep learning and mathematical physics. Before joining the faculty of the Math Department at Texas A&M in 2017, he was an NSF Postdoc in Math at MIT. He is currently a Visiting Scientist at Facebook AI Research in NYC.

“I will present several new results, joint with David Rolnick, about the number of linear regions and the sizes of the boundaries of linear regions in a network N with piecewise linear activations and random weights/biases.

I will discuss a new formula for the average complexity of linear regions that holds even for highly correlated weights and biases, and hence is valid throughout training. It shows, for example, that at initialization the number of regions along any 1D line grows like the number of neurons in N. In particular, perhaps surprisingly, this number is not exponential in the depth of the network.

I will explain the analog of this result for higher input dimension and will report on a number of experiments, which demonstrate empirically that our precise theorems at initialization can be expected to hold qualitatively throughout training.”
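
The headline quantity in the abstract, the number of linear regions crossed by a 1D line through input space, can be estimated numerically. The sketch below is not from the talk or the Hanin-Rolnick papers; it is a minimal illustration that counts changes in the ReLU activation pattern along a line through a randomly initialized fully connected network. The function names, layer widths, and He-style initialization are illustrative assumptions. At initialization, the count should be on the order of the number of hidden neurons rather than exponential in depth.

import numpy as np

def random_relu_net(widths, seed=0):
    # Random fully connected ReLU network with He-style initialization (an assumption).
    rng = np.random.default_rng(seed)
    layers = []
    for n_in, n_out in zip(widths[:-1], widths[1:]):
        W = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_out, n_in))
        b = rng.normal(0.0, 0.1, size=n_out)
        layers.append((W, b))
    return layers

def activation_pattern(layers, x):
    # Sign pattern of all hidden pre-activations at input x;
    # each distinct pattern corresponds to one linear region.
    pattern = []
    h = x
    for W, b in layers[:-1]:  # hidden layers only
        z = W @ h + b
        pattern.append(z > 0)
        h = np.maximum(z, 0.0)
    return np.concatenate(pattern)

def count_regions_along_line(layers, x0, x1, n_samples=20000):
    # Count distinct consecutive activation patterns along the segment x0 -> x1.
    # Finite sampling makes this a lower bound on the regions the line crosses.
    ts = np.linspace(0.0, 1.0, n_samples)
    regions = 1
    prev = activation_pattern(layers, x0)
    for t in ts[1:]:
        cur = activation_pattern(layers, (1 - t) * x0 + t * x1)
        if not np.array_equal(cur, prev):
            regions += 1
            prev = cur
    return regions

if __name__ == "__main__":
    widths = [10, 32, 32, 32, 1]  # input dim 10, three hidden layers (illustrative)
    net = random_relu_net(widths)
    rng = np.random.default_rng(1)
    x0, x1 = rng.normal(size=widths[0]), rng.normal(size=widths[0])
    print("hidden neurons:", sum(widths[1:-1]))
    print("regions along the line:", count_regions_along_line(net, x0, x1))
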

