Statistical learning guarantees for compressive clustering and compressive mixture modeling

  • Rémi Gribonval, Université de Lyon, France
  • Gilles Blanchard, Université Paris-Saclay, Orsay, France
  • Nicolas Keriven, CNRS, GIPSA-lab, UMR 5216, Saint-Martin-d’Hères, France
  • Yann Traonmilin, CNRS, Université de Bordeaux, Talence, France


Abstract

We provide statistical learning guarantees for two unsupervised learning tasks in the context of compressive statistical learning, a general framework for resource-efficient large-scale learning that we introduced in a companion paper. The principle of compressive statistical learning is to compress a training collection, in one pass, into a low-dimensional sketch (a vector of random empirical generalized moments) that captures the information relevant to the considered learning task. We explicitly describe and analyze random feature functions whose empirical averages preserve the information needed for compressive clustering and for compressive mixture modeling with fixed, known variance, and we establish sufficient sketch sizes given the problem dimensions.
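To illustrate the sketching step described above, here is a minimal Python sketch under stated assumptions: the sketch is taken to be the empirical average of complex exponential (random Fourier) features, one common instance of random generalized moments. The helper name `compute_sketch` and the Gaussian draw of frequencies are illustrative choices, not the paper's prescribed design.

```python
import numpy as np

def compute_sketch(X, Omega):
    """Average random Fourier features of the data (illustrative helper).

    X     : (n, d) array, the training collection
    Omega : (d, m) array of random frequency vectors
    Returns a complex sketch vector of size m.
    """
    # Each column omega_j of Omega defines one generalized moment:
    #   z_j = (1/n) * sum_i exp(i <omega_j, x_i>)
    return np.exp(1j * (X @ Omega)).mean(axis=0)

rng = np.random.default_rng(0)
n, d, m = 1000, 2, 50              # n samples, dimension d, sketch size m
X = rng.normal(size=(n, d))        # toy training collection
Omega = rng.normal(size=(d, m))    # random frequencies (Gaussian-drawn here)
z = compute_sketch(X, Omega)
print(z.shape)                     # the sketch is m-dimensional, independent of n
```

Because the sketch is an average of per-sample feature vectors, it can be computed in a single pass over the data, and sketches of disjoint subsets can be merged by weighted averaging; this is what makes the approach resource-efficient at large scale.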

Cite this article

Rémi Gribonval, Gilles Blanchard, Nicolas Keriven, Yann Traonmilin, Statistical learning guarantees for compressive clustering and compressive mixture modeling. Math. Stat. Learn. 3 (2020), no. 2, pp. 165–257

DOI 10.4171/MSL/21