
# Sketching Data Sets for Large-Scale Learning: Keeping only what you need

1 DANTE - Dynamic Networks : Temporal and Structural Capture Approach
Inria Grenoble - Rhône-Alpes, LIP - Laboratoire de l'Informatique du Parallélisme, IXXI - Institut Rhône-Alpin des systèmes complexes
2 PANAMA - Parcimonie et Nouveaux Algorithmes pour le Signal et la Modélisation Audio
Inria Rennes – Bretagne Atlantique , IRISA-D5 - SIGNAUX ET IMAGES NUMÉRIQUES, ROBOTIQUE
Abstract: Big data can be a blessing: with very large training datasets it becomes possible to perform complex learning tasks with unprecedented accuracy. Yet, this improved performance comes at the price of enormous computational challenges. Thus, one may wonder: is it possible to leverage the information content of huge datasets while keeping computational resources under control? Can this also help solve some of the privacy issues raised by large-scale learning? This is the ambition of compressive learning, where the dataset is massively compressed before learning. Here, a "sketch" is first constructed by computing carefully chosen nonlinear random features (e.g., random Fourier features) and averaging them over the whole dataset. Parameters are then learned from the sketch, without access to the original dataset. This article surveys the current state of the art in compressive learning, including the main concepts and algorithms; their connections with established signal-processing methods; existing theoretical guarantees, on both information preservation and privacy preservation; and important open problems. For an extended version of this article that contains additional references and more in-depth discussions on a variety of topics, see [1].
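The sketching step described in the abstract can be illustrated with a minimal NumPy example: random Fourier features are computed for each data point and averaged over the whole dataset, compressing it into a single fixed-size vector. The dataset, the bandwidth `sigma`, and the sketch size `m` below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: n points in d dimensions (synthetic, for illustration only).
n, d, m = 10_000, 2, 50
X = rng.normal(size=(n, d))

# Draw m random frequency vectors; the scale 1/sigma is a free parameter
# that in practice must be tuned to the data distribution.
sigma = 1.0
Omega = rng.normal(scale=1.0 / sigma, size=(d, m))

# Sketch: empirical average of complex exponentials (random Fourier
# features) over the whole dataset -- one m-dimensional vector replaces X.
z = np.exp(1j * (X @ Omega)).mean(axis=0)

print(z.shape)  # the sketch is a single vector of length m
```

Learning then proceeds from `z` alone; since each entry of the sketch averages a bounded complex exponential, its magnitude never exceeds 1, and the sketch size `m` is independent of the number of data points `n`.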
Document type: Journal article

https://hal.inria.fr/hal-03350599
Contributor: Rémi Gribonval
Submitted on: Tuesday, September 21, 2021 - 14:57:53
Last modified on: Friday, October 8, 2021 - 18:50:32

### File

SPM_paper.pdf
Files produced by the author(s)

### Citation

Rémi Gribonval, Antoine Chatalic, Nicolas Keriven, Vincent Schellekens, Laurent Jacques, et al.. Sketching Data Sets for Large-Scale Learning: Keeping only what you need. IEEE Signal Processing Magazine, Institute of Electrical and Electronics Engineers, 2021, 38 (5), pp.12-36. ⟨10.1109/MSP.2021.3092574⟩. ⟨hal-03350599⟩

### Metrics

Record views: 47

File downloads