How to use backpropagation to create a neural network out of any physical system
Peter McMahon, Cornell University
In this talk I will focus on our recent work [1], in which we describe and demonstrate a method for training essentially any physical system to act as a neural network. This opens the possibility of harnessing the complex dynamics of many natural systems to perform machine-learning computations for us. We have trained a piece of metal attached to a loudspeaker, a few-component nonlinear electronic circuit, and a second-harmonic-generation optical process each to perform MNIST handwritten-digit classification – a task with no obvious intuitive connection to any of the physical systems used. Our method is distinct from reservoir computing: we train tunable parameters of the physical systems themselves, rather than treating them as fixed (untrained) reservoirs followed by a linear classification layer. We have made the code [2] for our training procedure available so that anyone can try our technique, and we also welcome questions and potential collaboration.
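To illustrate the general idea of training a physical system's tunable parameters by backpropagation, here is a minimal toy sketch (not the code from [2]): the forward pass is taken from a "physical system" (simulated here by a simple nonlinear function standing in for real hardware), while gradients are obtained by backpropagating through a differentiable digital model of that system. All names and the specific dynamics are hypothetical, chosen only to keep the example self-contained and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

def physical_system(x, w, b):
    # Stand-in for a real physical process with tunable parameters w, b.
    # In a lab experiment this would be a hardware measurement, not code.
    return np.sin(w * x + b)

def digital_model_grads(x, w, b, grad_out):
    # Differentiable digital model of the same dynamics; it supplies the
    # parameter gradients that the physical forward pass alone cannot.
    pre = w * x + b
    d_pre = np.cos(pre) * grad_out      # chain rule through the nonlinearity
    return np.sum(d_pre * x), np.sum(d_pre)  # dLoss/dw, dLoss/db

# Toy regression task: match a target generated with w* = 2.0, b* = 0.5.
x = rng.uniform(-1, 1, size=64)
target = np.sin(2.0 * x + 0.5)

w, b = 0.5, 0.0
lr = 0.1
losses = []
for step in range(200):
    y = physical_system(x, w, b)          # forward pass: "hardware"
    loss = np.mean((y - target) ** 2)
    losses.append(loss)
    grad_y = 2 * (y - target) / x.size    # dLoss/dy
    gw, gb = digital_model_grads(x, w, b, grad_y)  # backward pass: model
    w -= lr * gw
    b -= lr * gb

print(f"initial loss {losses[0]:.4f} -> final loss {losses[-1]:.4f}")
```

The key design point this sketch tries to convey is the asymmetry between the passes: only forward evaluations of the physical system are needed, so any mismatch between hardware and the digital model affects gradient quality but not the measured outputs being trained.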