
Department of Applied Mathematics and Theoretical Physics

The problem of approximating Banach-valued functions of infinitely many variables has been studied intensively over the last decade, due to its applications in, for instance, computational uncertainty quantification. Here, functions of this type arise as solution maps of various parametric or stochastic PDEs. In this talk, I will discuss recent work on the approximation of such functions from finite samples. First, I will describe new lower bounds for various (adaptive) m-widths of classes of such functions. These bounds show that any combination of (adaptive) linear samples and a (linear or nonlinear) recovery procedure can at best achieve certain algebraic rates of convergence with respect to the number of samples. Next, I will focus on the case where the samples are i.i.d. pointwise samples from some underlying probability measure, as is commonly encountered in practice. I will discuss methods that construct multivariate polynomial approximations via least squares and compressed sensing. As I will show, these methods attain matching upper bounds, up to polylogarithmic factors. In particular, this implies that i.i.d. pointwise samples constitute near-optimal information for this problem, and that these schemes constitute near-optimal methods for reconstruction from such data. Finally, time permitting, I will discuss how these results can be extended to the problem of operator learning, yielding near-optimal guarantees for learning classes of holomorphic operators related to parametric PDEs.
Co-authors: Simone Brugiapaglia (Concordia), Nick Dexter (Florida State University) and Sebastian Moraga (Simon Fraser University)
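To illustrate the least-squares approach mentioned in the abstract, the following is a minimal sketch of constructing a multivariate polynomial approximation from i.i.d. pointwise samples. The target function, the uniform sampling measure, the tensorised Legendre basis, and all sizes (dimension, degree, sample count) are illustrative assumptions, not the specific setup of the talk.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
d, m, deg = 2, 200, 4  # dimension, number of samples, max total degree

# Stand-in for a smooth (holomorphic-type) parametric solution map.
f = lambda y: np.exp(y[:, 0]) / (2.0 + y[:, 1])

# i.i.d. pointwise samples drawn from the uniform measure on [-1, 1]^d.
Y = rng.uniform(-1.0, 1.0, size=(m, d))
b = f(Y)

# Multi-indices of total degree <= deg for a tensorised Legendre basis.
idx = [nu for nu in product(range(deg + 1), repeat=d) if sum(nu) <= deg]

def legendre_col(Y, nu):
    """Evaluate the tensor-product Legendre polynomial with index nu."""
    col = np.ones(len(Y))
    for j, n in enumerate(nu):
        # np.eye(n + 1)[n] selects the degree-n Legendre polynomial.
        col *= np.polynomial.legendre.legval(Y[:, j], np.eye(n + 1)[n])
    return col

# Design matrix and least-squares polynomial coefficients.
A = np.column_stack([legendre_col(Y, nu) for nu in idx])
c, *_ = np.linalg.lstsq(A, b, rcond=None)

# Estimate the uniform error at fresh random test points.
Yt = rng.uniform(-1.0, 1.0, size=(1000, d))
At = np.column_stack([legendre_col(Yt, nu) for nu in idx])
err = np.max(np.abs(At @ c - f(Yt)))
```

Because the function is smooth and the sample size comfortably exceeds the number of basis functions, the fitted polynomial is accurate over the whole parameter domain; the compressed-sensing variants discussed in the talk instead target sparse coefficient sets in much higher dimensions.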

Further information

Time:

Jul 17th 2024
09:30 to 10:10

Venue:

Seminar Room 1, Newton Institute

Speaker:

Ben Adcock (Simon Fraser University)

Series:

Isaac Newton Institute Seminar Series