
Using Wasserstein-2 regularization to ensure fair decisions with Neural-Network classifiers

Abstract : In this paper, we propose a new method to build fair Neural-Network classifiers using a constraint based on the Wasserstein distance. More specifically, we detail how to efficiently compute the gradients of Wasserstein-2 regularizers for Neural Networks. The proposed strategy is then used to train Neural-Network decision rules that favor fair predictions. Our method fully takes into account two specificities of Neural-Network training: (1) the network parameters are learned indirectly, based on automatic differentiation and the loss gradients, and (2) batch training is the gold standard for approximating the parameter gradients, as it requires a reasonable amount of computation and efficiently explores the parameter space. Results are shown on synthetic data as well as on the UCI Adult Income dataset. Our method performs well compared with 'ZafarICWWW17' and with linear regression using Wasserstein-1 regularization, as in 'JiangUAI19', in particular when non-linear decision rules are required for accurate predictions.
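To make the regularizer concrete: in one dimension, the squared Wasserstein-2 distance between two empirical distributions reduces to matching their quantile functions. The sketch below is an illustrative numpy computation of such a penalty between the classifier's output scores for two sensitive groups; it is a minimal assumption-laden sketch (function name `w2_squared` and the quantile-grid discretization are ours), not the authors' implementation, which differentiates the regularizer through the network.

```python
import numpy as np

def w2_squared(scores_a, scores_b, n_quantiles=100):
    """Squared 1D Wasserstein-2 distance between two empirical score
    distributions, approximated on a uniform quantile grid.

    scores_a, scores_b: 1D arrays of classifier outputs for the two
    sensitive groups (sample sizes may differ).
    """
    qs = np.linspace(0.0, 1.0, n_quantiles)
    # Quantile functions (inverse CDFs) of each group's scores.
    qa = np.quantile(scores_a, qs)
    qb = np.quantile(scores_b, qs)
    # W2^2 is the mean squared difference between the quantile functions.
    return np.mean((qa - qb) ** 2)
```

A fairness-regularized loss would then take the hedged form `loss = classification_loss + lam * w2_squared(scores_group0, scores_group1)`, computed on each mini-batch, driving the two groups' score distributions together as `lam` grows.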
Document type :
Preprints, Working Papers, ...

https://hal-enac.archives-ouvertes.fr/hal-02271117
Contributor : Laurence Porte
Submitted on : Monday, August 26, 2019 - 3:30:31 PM
Last modification on : Tuesday, October 20, 2020 - 10:32:07 AM


Identifiers

  • HAL Id : hal-02271117, version 1
  • ARXIV : 1908.05783

Citation

Laurent Risser, Quentin Vincenot, Nicolas Couellan, Jean-Michel Loubes. Using Wasserstein-2 regularization to ensure fair decisions with Neural-Network classifiers. 2019. ⟨hal-02271117⟩
