Journal article

Probabilistic Robustness Estimates for Feed-forward Neural Networks

Abstract: Robustness of deep neural networks is a critical issue in practical applications. For general feed-forward neural networks (including convolutional architectures) subject to random noise attacks, we study the probability that the output of the network deviates from its nominal value by more than a given threshold. We derive a simple concentration inequality for the propagation of the input uncertainty through the network, using the Cramér-Chernoff method and estimates of the local variation of the neural network mapping computed at the training points. We further discuss and exploit the resulting condition on the network to regularize the loss function during training. Finally, we assess the proposed tail probability estimates empirically on various public datasets and show that the observed robustness is very well estimated by the proposed method.
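The following is a minimal, illustrative sketch (not the paper's implementation) of the two ingredients named in the abstract: a finite-difference estimate of the local variation of the network mapping around a given point, and a Cramér-Chernoff style tail bound on the output deviation under Gaussian input noise. All names and parameters here (f, local_variation, chernoff_tail_bound, sigma, n_dirs) are assumptions made for this example; the paper's actual estimates and bounds may differ.

```python
# Illustrative sketch only: local-variation estimate + Cramer-Chernoff tail bound.
import numpy as np

def local_variation(f, x, eps=1e-3, n_dirs=20, rng=None):
    """Finite-difference estimate of the local variation of f around x:
    the largest observed ratio ||f(x + d) - f(x)|| / ||d|| over random
    small perturbation directions d (an assumption for this sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    fx = f(x)
    ratios = []
    for _ in range(n_dirs):
        d = rng.normal(size=x.shape)
        d *= eps / np.linalg.norm(d)          # perturbation of norm eps
        ratios.append(np.linalg.norm(f(x + d) - fx) / eps)
    return max(ratios)

def chernoff_tail_bound(threshold, kappa, sigma, dim):
    """Cramer-Chernoff bound on P(kappa * ||delta|| >= threshold) for
    Gaussian noise delta ~ N(0, sigma^2 I_dim), using the fact that
    ||delta||^2 / sigma^2 is chi-squared with `dim` degrees of freedom."""
    t2 = (threshold / kappa) ** 2
    # log E[exp(l * ||delta||^2)] = -dim/2 * log(1 - 2*l*sigma^2), for l < 1/(2 sigma^2)
    ls = np.linspace(1e-6, (1.0 / (2.0 * sigma**2)) * (1.0 - 1e-6), 1000)
    log_bound = -0.5 * dim * np.log(1.0 - 2.0 * ls * sigma**2) - ls * t2
    return float(min(1.0, np.exp(log_bound.min())))

# Toy usage: a random linear map stands in for a trained network.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 10)) / np.sqrt(10)
f = lambda x: W @ x
x0 = rng.normal(size=10)
kappa = local_variation(f, x0, rng=rng)
print("local variation estimate:", kappa)
print("tail bound on P(||f(x+delta) - f(x)|| >= 0.5), sigma = 0.1:",
      chernoff_tail_bound(0.5, kappa, sigma=0.1, dim=10))
```

A larger local-variation estimate yields a looser tail bound, which is consistent with the abstract's idea of using the resulting condition on the network to regularize the loss function during training.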

https://hal-enac.archives-ouvertes.fr/hal-03213024
Contributor: Nicolas Couellan
Submitted on: Friday, April 30, 2021 - 10:20:46
Last modified on: Monday, July 4, 2022 - 08:38:13
Long-term archiving on: Saturday, July 31, 2021 - 18:22:29

File

ProbEstimNN.pdf
Files produced by the author(s)

Identifiers

Citation

Nicolas Couellan. Probabilistic Robustness Estimates for Feed-forward Neural Networks. Neural Networks, Elsevier, 2021, 142, pp.138-147. ⟨10.1016/j.neunet.2021.04.037⟩. ⟨hal-03213024⟩


Metrics

Record views: 100

File downloads: 167