
Mathematical Statistics and Learning

Volume 3, Issue 2, 2020, pp. 113–164
DOI: 10.4171/MSL/20

Published online: 2021-08-20

Compressive statistical learning with random feature moments

Rémi Gribonval[1], Gilles Blanchard[2], Nicolas Keriven[3] and Yann Traonmilin[4]

(1) Université de Lyon, France
(2) Université Paris-Saclay, Orsay, France
(3) CNRS, GIPSA-lab, UMR 5216, Saint-Martin-d’Hères, France
(4) CNRS, Université de Bordeaux, Talence, France

We describe a general framework, compressive statistical learning, for resource-efficient large-scale learning: the training collection is compressed in one pass into a low-dimensional sketch (a vector of random empirical generalized moments) that captures the information relevant to the considered learning task. A near-minimizer of the risk is computed from the sketch through the solution of a nonlinear least squares problem. We investigate sufficient sketch sizes to control the generalization error of this procedure. The framework is illustrated on compressive PCA, compressive clustering, and compressive Gaussian mixture modeling with fixed known variance. The latter two are further developed in a companion paper.
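As an illustration of the one-pass sketching step described in the abstract, the following is a minimal sketch in Python: it averages random Fourier features (one common choice of random generalized moments, assumed here for concreteness; the dataset, dimensions, and Gaussian frequency draws are all illustrative) over the training collection to produce a single low-dimensional sketch vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training collection: n points in d dimensions (two well-separated clusters).
n, d = 1000, 2
X = np.vstack([rng.normal(-2.0, 0.3, (n // 2, d)),
               rng.normal(+2.0, 0.3, (n // 2, d))])

# m random frequencies defining the generalized moments phi_w(x) = exp(i w^T x).
# A Gaussian frequency distribution is assumed here purely for illustration.
m = 50
W = rng.normal(0.0, 1.0, (m, d))

# One-pass sketch: the empirical average of the random feature moments.
# Each data point contributes once, so this is streamable and mergeable.
z = np.exp(1j * (X @ W.T)).mean(axis=0)

print(z.shape)  # a single complex vector of length m summarizes the collection
```

Learning then proceeds from `z` alone, e.g. by fitting the parameters of a mixture model whose expected sketch best matches `z` in the nonlinear least squares sense; that fitting step is not shown here.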

Keywords: Kernel mean embedding, random features, random moments, statistical learning, dimension reduction

Gribonval Rémi, Blanchard Gilles, Keriven Nicolas, Traonmilin Yann: Compressive statistical learning with random feature moments. Math. Stat. Learn. 3 (2020), 113–164. doi: 10.4171/MSL/20