
Please join us for a Statistics and DSI joint colloquium.

Wednesday, January 22
4:00pm – 5:00pm
John Crerar Library, 390

Title: Neural Network Scaling Limits

Abstract: Neural networks are remarkable families of non-linear functions that form the backbone of state-of-the-art algorithms in tasks ranging from computer vision (self-driving cars) and natural language processing (Google Translate) to reinforcement learning (AlphaGo). After giving a precise definition of what neural networks are, I will explain how important practical questions about their complexity, stability, and optimization can be recast in mathematical terms. The problems that arise are typically statistical or probabilistic in nature. I will focus on several such questions I have studied recently, involving the complexity of random hyperplane arrangements and the spectral theory of products of large random matrices.
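For readers unfamiliar with the last object mentioned above, the following is a minimal numerical sketch (not taken from the talk) of the kind of quantity studied in the spectral theory of products of large random matrices: the singular values of a product of independent Gaussian matrices, normalized by the number of factors. The dimensions, scaling, and random seed are illustrative assumptions only.

```python
import numpy as np

# Illustrative sketch: empirical spectrum of a product of L independent
# n x n Gaussian matrices (the dimensions and scaling are assumptions,
# not parameters from the talk).
rng = np.random.default_rng(0)
n, L = 500, 10

prod = np.eye(n)
for _ in range(L):
    # Scale each factor by 1/sqrt(n) so its singular values stay O(1).
    prod = prod @ (rng.standard_normal((n, n)) / np.sqrt(n))

# Lyapunov-type exponents: log singular values normalized by the depth L.
sv = np.linalg.svd(prod, compute_uv=False)
exponents = np.log(sv) / L
print(exponents[:5])  # a few of the largest normalized log singular values
```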

Bio: I am an Assistant Professor at Princeton ORFE (and Associated Faculty at Princeton PACM) working on deep learning, probability, and spectral asymptotics. Prior to Princeton, I was an Assistant Professor in Mathematics at Texas A&M, an NSF Postdoc at MIT Math, and a PhD student in Math at Northwestern, where I was supervised by Steve Zelditch. I am also an advisor and member of the technical staff at Foundry, an incredible AI/computing startup that seeks to orchestrate the world’s compute. Please see my CV for more information.
