
Applied and Computational Math Seminar

Some recent advances in statistical asymptotic theory via RKHS approximations

Rui Tuo (Texas A&M University)

Location:  Hill 705
Date & time: Wednesday, 14 February 2024, 11:00 AM – 12:00 PM

Reproducing Kernel Hilbert Spaces (RKHS) techniques are a cornerstone in various statistical
methods and machine learning algorithms. They provide a robust mathematical foundation
for developing new algorithms and understanding existing ones, particularly in terms of their
approximation capabilities, generalization properties, and computational efficiency. My
presentation will highlight two recent advances in statistical asymptotic theory, leveraging
RKHS approximation properties. The first part addresses the approximation efficacy of
Gaussian process regression, a Bayesian nonparametric method for reconstructing functions
from scattered data. Our research establishes its statistical asymptotic theory. We utilize
maximal inequalities for Gaussian processes to transform stochastic errors into a
deterministic framework, and the latter is shown to be closely related to an approximation
problem in the RKHS. The second segment focuses on kernel ridge regression, a popular
nonparametric regression method. While its global convergence is well-documented, the
local properties of these estimators remain less understood. We analyze these estimators’
linear functionals by separating bias and variance. Remarkably, both can be reinterpreted
through RKHS approximation indicators. This leads to exact convergence rates and a central
limit theorem for various local estimators. This talk aims to provide insights into these
complex statistical tools, enhancing their accessibility and comprehension in the broader
fields of statistics and machine learning.
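As background for both methods discussed in the abstract, a minimal sketch of kernel ridge regression on scattered data may be helpful. With a Gaussian (RBF) kernel, the kernel ridge estimator coincides with the posterior mean of Gaussian process regression when the regularization level matches the noise variance. The kernel, lengthscale, and regularization values below are illustrative assumptions, not the speaker's choices.

```python
import numpy as np

def kernel_ridge_regression(X, y, x_new, lam=1e-3, ell=0.2):
    """Kernel ridge regression with an RBF kernel.

    Solves (K + n*lam*I) alpha = y, then predicts k(x_new, X) @ alpha.
    With n*lam equal to the observation noise variance, this is also
    the posterior mean of GP regression with the same kernel.
    """
    def k(a, b):
        # Pairwise RBF kernel matrix between 1-D point sets a and b.
        return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * ell**2))

    n = len(X)
    alpha = np.linalg.solve(k(X, X) + n * lam * np.eye(n), y)
    return k(x_new, X) @ alpha

# Reconstruct f(x) = sin(2*pi*x) from noisy scattered observations.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 50)
y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=50)

x_new = np.linspace(0, 1, 5)
f_hat = kernel_ridge_regression(X, y, x_new)
```

The statistical questions in the talk concern how estimators of this form behave asymptotically: globally (convergence of `f_hat` to the true function as the sample size grows) and locally (bias, variance, and limiting distributions of linear functionals such as point evaluations).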