Physics for Neural Networks

REGISTRATION REQUIRED
Date
Apr 17, 2023 – Apr 19, 2023
Location
PCTS, Jadwin Hall, Room 407
Sponsor
PCTS
Event Description

Organizers: Boris Hanin (PU ORFE), Giorgio Cipolloni (PCTS), William Bialek (PU Physics), Francesca Mignacco

A hallmark of modern machine learning, exemplified by deep learning, is the use of exceptionally large models to fit complex datasets. While the empirical success of the neural network models used in deep learning is undeniable, a predictive formalism for describing and improving them is still nascent. On the theoretical side, a promising avenue for progress is to view neural networks as large collections of (often weakly) interacting parameters. From this perspective, it is natural to study them with tools from modern physics, including random matrix theory, statistical mechanics, the 1/n expansion, and field theory. On the experimental side, neural networks present a unique opportunity for impactful empirical investigation in which the microscopic ground truth is known to arbitrary precision. The purpose of this workshop is to bring together scientists from a variety of disciplines related to modern physics who are interested in developing both the theory and phenomenology of deep learning.