Conference paper, 2021

Overcoming the curse of dimensionality with Laplacian regularization in semi-supervised learning

Abstract

As annotations of data can be scarce in large-scale practical problems, leveraging unlabelled examples is one of the most important aspects of machine learning. This is the aim of semi-supervised learning. To benefit from the access to unlabelled data, it is natural to smoothly diffuse knowledge from labelled data to unlabelled data, which leads to the use of Laplacian regularization. Yet, current implementations of Laplacian regularization suffer from several drawbacks, notably the well-known curse of dimensionality. In this paper, we provide a statistical analysis to overcome those issues, and unveil a large body of spectral filtering methods that exhibit desirable behaviors. They are implemented through (reproducing) kernel methods, for which we provide realistic computational guidelines so that our method can be used with large amounts of data.
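For readers unfamiliar with the setup, the sketch below illustrates the classical graph-based form of Laplacian regularization that the abstract refers to: labels are diffused from labelled to unlabelled points by penalizing the quadratic form f^T L f of a graph Laplacian built on all points. This is a minimal illustrative sketch only, not the spectral-filtering kernel estimator proposed in the paper; the function name laplacian_regularization and the parameters sigma (Gaussian bandwidth) and lam (regularization weight) are assumptions chosen for the example.

```python
# Minimal sketch of graph-based Laplacian regularization for semi-supervised
# learning (illustrative only; not the paper's spectral-filtering estimator).
import numpy as np

def laplacian_regularization(X, y_labelled, labelled_idx, sigma=1.0, lam=1e-2):
    """Predict real-valued labels for all points in X.

    X            : (n, d) array of labelled + unlabelled inputs.
    y_labelled   : labels for the rows indexed by labelled_idx.
    labelled_idx : indices of the labelled rows.
    sigma, lam   : kernel bandwidth and regularization strength (assumed names).
    """
    n = X.shape[0]
    # Gaussian affinity matrix W and unnormalized graph Laplacian L = D - W.
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq_dists / (2 * sigma ** 2))
    L = np.diag(W.sum(axis=1)) - W

    # Diagonal mask selecting the labelled points, and zero-padded label vector.
    S = np.zeros((n, n))
    S[labelled_idx, labelled_idx] = 1.0
    y = np.zeros(n)
    y[labelled_idx] = y_labelled

    # Minimize sum_{i labelled} (f_i - y_i)^2 + lam * f^T L f  (closed form).
    return np.linalg.solve(S + lam * L, y)

# Usage: two noisy clusters, one labelled point per cluster.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
f = laplacian_regularization(X, y_labelled=np.array([-1.0, 1.0]),
                             labelled_idx=np.array([0, 20]))
print(np.sign(f[:5]), np.sign(f[-5:]))  # expect -1s then +1s
```

As the abstract points out, this naive graph-based construction is exactly the kind of implementation that degrades in high dimension, which motivates the statistical analysis and the kernel-based spectral filtering developed in the paper.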
Main file: main.pdf (2.31 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03454809, version 1 (29-11-2021)

Identifiers

  • HAL Id: hal-03454809, version 1

Cite

Vivien Cabannes, Loucas Pillaud-Vivien, Francis Bach, Alessandro Rudi. Overcoming the curse of dimensionality with Laplacian regularization in semi-supervised learning. NeurIPS 2021 - Thirty-fifth Conference on Neural Information Processing Systems, Dec 2021, Online. ⟨hal-03454809⟩