Probabilistic Robustness Estimates for Deep Neural Networks - HAL open archive
Conference paper

Probabilistic Robustness Estimates for Deep Neural Networks

Nicolas Couellan

Abstract

Robustness of deep neural networks is a critical issue in practical applications. For deep dense neural networks under random noise attacks, we study the probability that the output of the network deviates from its nominal value by more than a given threshold. We derive a simple concentration inequality for the propagation of the input uncertainty through the network, using the Cramér-Chernoff method and estimates of the local variation of the neural network mapping computed at the training points. We further discuss and exploit the resulting condition on the network to regularize the loss function during training. Finally, we assess the proposed tail probability estimate empirically on three public regression datasets and show that the observed robustness is very well estimated by the proposed method.
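The abstract's idea can be illustrated with a minimal sketch — this is not the paper's actual construction, and the toy map, the choice of training point, and all parameter values below are assumptions. A smooth function stands in for a trained dense network, its gradient norm at a "training point" serves as the local variation estimate, and a Cramér-Chernoff bound on the chi-square tail of the Gaussian noise norm yields an estimate of the output deviation probability, checked against Monte Carlo sampling:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy smooth map R^d -> R standing in for a trained dense network (hypothetical).
d = 10
W = np.ones(d) / np.sqrt(d)          # fixed weights, ||W|| = 1
def f(x):
    return np.tanh(x @ W)

x0 = np.full(d, 0.1)                 # a "training point"
sigma = 0.05                         # std of the Gaussian input noise
t = 0.3                              # deviation threshold on the output

# Local variation estimate: gradient norm of f at x0 (central differences).
eps = 1e-5
grad = np.array([(f(x0 + eps * np.eye(d)[i]) - f(x0 - eps * np.eye(d)[i]))
                 / (2 * eps) for i in range(d)])
L = np.linalg.norm(grad)

# If |f(x0 + delta) - f(x0)| <= L * ||delta|| near x0 and delta ~ N(0, sigma^2 I),
# then P(deviation >= t) <= P(chi2_d >= a) with a = (t / (L * sigma))^2.
# Cramer-Chernoff bound on the chi-square tail (nontrivial when a > d):
#   P(chi2_d >= a) <= exp(-(a - d) / 2) * (a / d)^(d / 2)
a = (t / (L * sigma)) ** 2
bound = np.exp(-(a - d) / 2) * (a / d) ** (d / 2) if a > d else 1.0

# Monte Carlo check of the actual deviation probability.
n = 200_000
delta = sigma * rng.standard_normal((n, d))
mc = np.mean(np.abs(f(x0 + delta) - f(x0)) >= t)
print(f"L = {L:.3f}, Chernoff bound = {bound:.2e}, Monte Carlo = {mc:.2e}")
```

With these (assumed) settings the bound is conservative but nonvacuous, and the empirical deviation frequency stays below it, mirroring the comparison the paper reports on real regression datasets.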
Main file

UDL2020-paper-050.pdf (528.07 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-02572277 , version 1 (13-05-2020)
hal-02572277 , version 2 (29-09-2020)

Identifiers

  • HAL Id: hal-02572277, version 2

Cite

Nicolas Couellan. Probabilistic Robustness Estimates for Deep Neural Networks. ICML workshop on Uncertainty and Robustness in Deep Learning, International Conference on Machine Learning (ICML), Jul 2020, Virtual Conference, United States. ⟨hal-02572277v2⟩
276 views
402 downloads
