ENAC - École nationale de l'aviation civile
Preprint / Working paper, Year: 2022

Canonical foliations of neural networks: application to robustness

Abstract

Deep learning models are known to be vulnerable to adversarial attacks, and adversarial learning has therefore become a crucial task. We propose a new perspective on neural network robustness based on Riemannian geometry and foliation theory. The idea is illustrated by designing a new adversarial attack that takes the curvature of the data space into account. This attack, called the two-step spectral attack, is a piecewise-linear approximation of a geodesic in the data space. The data space is treated as a (degenerate) Riemannian manifold equipped with the pullback of the Fisher Information Metric (FIM) of the neural network. In most cases this metric is only semi-definite, and its kernel becomes a central object of study. A canonical foliation is derived from this kernel. The curvature of the transverse leaves provides the correction needed to obtain a two-step approximation of the geodesic, and hence a new, efficient adversarial attack. The method is first illustrated on a 2D toy example in order to visualize the neural network foliation and the corresponding attacks. Next, experiments on the MNIST dataset compare the proposed technique with a state-of-the-art attack presented in Zhao et al. (2019). The results show that the proposed attack is more efficient at every level of available attack budget (the norm of the attack), confirming that the curvature of the transverse leaves of the neural network's FIM foliation plays an important role in the robustness of neural networks.
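For readers who want a concrete picture of the attack outlined in the abstract, the sketch below illustrates one plausible reading of a two-step spectral attack built from the pullback FIM of a softmax classifier. It is not the authors' implementation: the toy network logits_fn, the 50/50 budget split, and the sign heuristic for the second eigenvector are illustrative assumptions.

```python
# Hedged sketch (not the authors' code): a two-step spectral attack driven by
# the pullback of the Fisher Information Metric (FIM) of a softmax classifier.
import jax
import jax.numpy as jnp

def logits_fn(params, x):
    """Toy one-hidden-layer classifier (assumed architecture)."""
    h = jnp.tanh(params["W1"] @ x + params["b1"])
    return params["W2"] @ h + params["b2"]

def pullback_fim(params, x):
    """G(x) = J^T (diag(p) - p p^T) J: the FIM of the categorical output
    distribution pulled back to input space through the logits Jacobian."""
    J = jax.jacobian(lambda x_: logits_fn(params, x_))(x)   # (classes, dim)
    p = jax.nn.softmax(logits_fn(params, x))
    F = jnp.diag(p) - jnp.outer(p, p)                       # FIM w.r.t. logits
    return J.T @ F @ J                                      # (dim, dim), PSD

def leading_direction(G):
    """Unit eigenvector associated with the largest eigenvalue of the metric."""
    _, V = jnp.linalg.eigh(G)                               # ascending order
    return V[:, -1]

def two_step_spectral_attack(params, x, budget, split=0.5):
    """Piecewise-linear geodesic approximation: step along the top eigenvector,
    recompute the metric at the intermediate point, then step again."""
    v1 = leading_direction(pullback_fim(params, x))
    x_mid = x + split * budget * v1
    v2 = leading_direction(pullback_fim(params, x_mid))
    # Keep the second step roughly aligned with the first (sign heuristic).
    v2 = jnp.where(jnp.dot(v1, v2) < 0, -v2, v2)
    return x_mid + (1.0 - split) * budget * v2

# Example usage on a random toy network (illustrative parameters only).
key = jax.random.PRNGKey(0)
params = {
    "W1": jax.random.normal(key, (16, 2)), "b1": jnp.zeros(16),
    "W2": jax.random.normal(key, (3, 16)), "b2": jnp.zeros(3),
}
x_adv = two_step_spectral_attack(params, jnp.array([0.3, 0.7]), budget=0.1)
```

Under this reading, a one-step attack would keep only the first eigenvector step, while the second step is what lets the perturbation follow the curvature of the transverse leaf.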
Main file
main.pdf (1.27 MB)

Supplementary files
foliation_50neur_or.pdf (84.36 KB)
foliation_50neur_xor.pdf (227.5 KB)
fooling_rates_compared_nsample=10000_start=0_batch=0_good_label.pdf (15.33 KB)
fooling_rates_compared_nsample=5000_start=0_nl=Sigmoid_batch=0.pdf (14.62 KB)
fooling_rates_compared_nsample=5000_start=0_nl=Sigmoid_batch=0I=_0.4,0.6.pdf (14.92 KB)
fooling_rates_increase_with_pictures_zones.pdf (23.59 KB)
inf_norm_compared_nsample=10000_StandardTwoStepSpectralAttack_worst-case_budget=3.0.pdf (25.52 KB)
inf_norm_compared_nsample=10000_StandardTwoStepSpectralAttack_worst-case_budget=7.0.pdf (25.89 KB)
one-step-vs-geodesic.pdf (2.07 KB)
one-step-vs-geodesic.pdf_tex (2.82 KB)
one-step.pdf (1.31 KB)
one-step.pdf_tex (2.58 KB)
plot_attacks_2D_budget=1e-1_nsample=100_nl=Sigmoid.pdf (53.29 KB)
plot_curvature_nl=Sigmoid.pdf (298.83 KB)
two-step.pdf (1.6 KB)
two-step.pdf_tex (2.86 KB)
two-step_triangular.pdf (2.13 KB)
two-step_triangular.pdf_tex (3.14 KB)

Origin: Files produced by the author(s)

Dates and versions

hal-03593479 , version 1 (02-03-2022)
hal-03593479 , version 2 (13-06-2023)

Identifiers

HAL Id: hal-03593479

Cite

Eliot Tron, Nicolas Couellan, Stéphane Puechmorel. Canonical foliations of neural networks: application to robustness. 2022. ⟨hal-03593479v2⟩
131 views
99 downloads
