Seminars & Colloquia Calendar

Special Colloquium

Deep Learning Meets Sparse Regularization

Rahul Parhi, EPFL

Location: Hill 705
Date & time: Friday, 19 January 2024, 2:00PM - 3:00PM

Abstract: Deep learning has been wildly successful in practice, and most state-of-the-art artificial intelligence systems are based on neural networks. What is lacking, however, is a rigorous mathematical theory that adequately explains the amazing performance of deep neural networks. In this talk, I present a new mathematical framework that provides the beginning of a deeper understanding of deep learning. This framework precisely characterizes the functional properties of trained neural networks. The key mathematical tools that support this framework include transform-domain sparse regularization, the Radon transform of computed tomography, and nonlinear approximation theory, all of which are deeply rooted in harmonic analysis. The framework explains the effect of weight decay regularization in neural network training, the importance of skip connections and low-rank weight matrices in network architectures, the role of sparsity in neural networks, and why neural networks can perform well in high-dimensional problems.
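For readers unfamiliar with the weight decay regularization mentioned in the abstract, the sketch below (not taken from the speaker's work) shows the standard setup on a one-hidden-layer ReLU network trained by gradient descent: a penalty lambda * (||W||_F^2 + ||v||^2) is added to the data-fitting loss, shrinking the network weights at every step. All names, data, and hyperparameters here are illustrative assumptions.

# Minimal sketch of weight decay on a shallow ReLU network (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data
x = np.linspace(-1, 1, 50).reshape(-1, 1)      # inputs, shape (50, 1)
y = np.sin(np.pi * x)                          # targets, shape (50, 1)

m = 32                                         # hidden width
W = rng.normal(size=(1, m))                    # input weights
b = rng.normal(size=(m,))                      # hidden biases
v = rng.normal(size=(m, 1)) / np.sqrt(m)       # output weights
lam, lr = 1e-3, 1e-2                           # weight-decay strength, step size

for step in range(5000):
    z = x @ W + b                              # pre-activations, shape (50, m)
    a = np.maximum(z, 0.0)                     # ReLU activations
    err = a @ v - y                            # residuals, shape (50, 1)

    # Gradients of mean squared error plus the weight-decay penalty
    grad_v = a.T @ err / len(x) + lam * v
    grad_z = (err @ v.T) * (z > 0)             # backprop through ReLU
    grad_W = x.T @ grad_z / len(x) + lam * W
    grad_b = grad_z.mean(axis=0)               # biases are typically not decayed

    v -= lr * grad_v
    W -= lr * grad_W
    b -= lr * grad_b

print("final training MSE:",
      float(np.mean((np.maximum(x @ W + b, 0.0) @ v - y) ** 2)))

One theme of the talk is characterizing which functions such weight-decay-trained networks actually learn; the code above only illustrates the training objective itself.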

Special Note to All Travelers

Directions: see our map and driving directions page. If you need information on public transportation, you may want to check the New Jersey Transit page.

Unfortunately, cancellations do occur from time to time. Feel free to call our department at 848-445-6969 before embarking on your journey. Thank you.