Applied and Computational Math Seminar

On the Convergence of Stochastic Gradient Descent and Its Variants for Inverse Problems

Zehui Zhou (Rutgers University)

Location:  Hill 705
Date & time: Wednesday, 24 April 2024 at 11:00AM - 12:00PM

In this talk, I will present stochastic gradient type methods for solving large-scale ill-posed inverse problems, which arise naturally in many real-world applications, especially parameter identification for partial differential equations.
Among existing techniques, iterative regularization represents a very powerful class of solvers, e.g., the Landweber method. Stochastic gradient descent (SGD), a randomized version of the classical Landweber method, is very promising for solving large-scale inverse problems due to its excellent scalability with respect to the problem size. However, despite its computational appeal, the properties of SGD for solving inverse problems remain poorly understood. Furthermore, SGD can suffer from an undesirable saturation phenomenon for inverse problems with smooth solutions. In this talk, I will present a convergence analysis of SGD for nonlinear inverse problems and methods for addressing the saturation phenomenon of SGD for linear inverse problems.
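To make the connection between the two methods concrete, here is a minimal illustrative sketch (not the speaker's implementation): for a least-squares formulation of a linear problem Ax = y, the Landweber method takes full-gradient steps, while SGD updates with one randomly sampled equation (row) per step. The small ill-conditioned matrix A, the decaying singular values, the noise level, and all step sizes below are assumptions chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small ill-posed linear problem A x = y (illustration only):
# rapidly decaying singular values make A ill-conditioned.
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 0.9 ** np.arange(n)                          # decaying singular values
A = U @ np.diag(s) @ V.T
x_true = V[:, 0]                                 # "smooth" solution: leading right singular vector
y = A @ x_true + 1e-3 * rng.standard_normal(n)   # noisy measurements

def landweber(A, y, steps=400, eta=0.5):
    """Classical Landweber iteration: full-gradient descent on 0.5*||A x - y||^2."""
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x -= eta * A.T @ (A @ x - y)
    return x

def sgd(A, y, steps=20000, eta=0.5):
    """Randomized Landweber (SGD): update using one randomly sampled row per step."""
    m, d = A.shape
    x = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(m)                      # pick one equation at random
        x -= eta * (A[i] @ x - y[i]) * A[i]      # cheap per-step update, O(d) work
    return x

err_lw = np.linalg.norm(landweber(A, y) - x_true)
err_sgd = np.linalg.norm(sgd(A, y) - x_true)
print(err_lw, err_sgd)
```

Each SGD step touches a single row of A, which is the source of the scalability the abstract mentions; in expectation, 20000 single-row steps match the 400 full-gradient Landweber sweeps above.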