(Univ Lyon, ENS de Lyon, Inria, CNRS, UCBL, LIP UMR 5668)
Title — Signal reconstruction using determinantal sampling
Abstract — We study the approximation of a square-integrable function from a finite number of its evaluations at well-chosen nodes. The function is assumed to belong to a reproducing kernel Hilbert space (RKHS), the approximation to be one of a few natural finite-dimensional approximations, and the nodes to be drawn from one of two probability distributions. Both distributions are related to determinantal point processes, and both use the kernel of the RKHS to favor RKHS-adapted regularity in the random design. While previous work on determinantal sampling relied on the RKHS norm, we prove mean-square guarantees in the L2 norm. We show that determinantal point processes and mixtures thereof can yield fast rates, and that they shed some light on how the rate changes as more smoothness is assumed, a phenomenon known as superconvergence. Moreover, determinantal sampling generalizes i.i.d. sampling from the Christoffel function, a standard approach in the literature. In particular, determinantal sampling guarantees the so-called instance optimality property with a smaller number of function evaluations than i.i.d. sampling.
Bio
Ayoub Belhadji is currently a postdoctoral researcher at the Laboratoire de l'Informatique du Parallélisme (LIP) at ENS de Lyon. He received his Ph.D. degree from Ecole Centrale de Lille in 2020, specializing in theoretical signal processing and machine learning. His main research interests include sampling, kernel methods, and sketching methods.