
Mathematical Statistics and Learning


Volume 3, Issue 2, 2020, pp. 165–257
DOI: 10.4171/MSL/21

Published online: 2021-08-20

Statistical learning guarantees for compressive clustering and compressive mixture modeling

Rémi Gribonval[1], Gilles Blanchard[2], Nicolas Keriven[3] and Yann Traonmilin[4]

(1) Université de Lyon, France
(2) Université Paris-Saclay, Orsay, France
(3) CNRS, GIPSA-lab, UMR 5216, Saint-Martin-d’Hères, France
(4) CNRS, Université de Bordeaux, Talence, France

We provide statistical learning guarantees for two unsupervised learning tasks in the context of compressive statistical learning, a general framework for resource-efficient large-scale learning that we introduced in a companion paper. The principle of compressive statistical learning is to compress a training collection, in one pass, into a low-dimensional sketch (a vector of random empirical generalized moments) that captures the information relevant to the considered learning task. We explicitly describe and analyze random feature functions whose empirical averages preserve the information needed for compressive clustering and compressive Gaussian mixture modeling with fixed, known variance, and we establish sufficient sketch sizes given the problem dimensions.
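The one-pass sketching step described above can be illustrated with a minimal example. The snippet below is not the authors' implementation: it assumes random Fourier features as the generalized moment functions and a Gaussian frequency distribution, both simplifications; the names `compute_sketch` and `Omega` are hypothetical.

```python
import numpy as np

def compute_sketch(X, Omega):
    """One-pass empirical sketch of a training collection.

    X     : (n, d) array, the training collection
    Omega : (d, m) array of random frequencies defining m feature functions
    Returns the length-m complex sketch: the empirical average of the
    random Fourier features phi_omega(x) = exp(i <omega, x>).
    """
    return np.exp(1j * (X @ Omega)).mean(axis=0)

# Illustration on a synthetic two-cluster collection
rng = np.random.default_rng(0)
d, m = 2, 50
X = np.concatenate([rng.normal(c, 0.1, size=(100, d)) for c in (-1.0, 1.0)])
Omega = rng.normal(scale=2.0, size=(d, m))
z = compute_sketch(X, Omega)
print(z.shape)  # (50,)
```

The sketch `z` has a fixed size m regardless of the number of samples n, which is what makes the approach resource-efficient: learning (e.g. recovering cluster centers) then operates on `z` alone rather than on the full collection.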

Keywords: Kernel mean embedding, random features, random moments, statistical learning, unsupervised learning, clustering, mixture modeling

Gribonval Rémi, Blanchard Gilles, Keriven Nicolas, Traonmilin Yann: Statistical learning guarantees for compressive clustering and compressive mixture modeling. Math. Stat. Learn. 3 (2020), 165–257. doi: 10.4171/MSL/21