Title
Remote Explainability Faces The Bouncer Problem
Abstract
The concept of explainability is envisioned to satisfy society's demands for transparency about machine learning decisions. The idea is simple: as with humans, algorithms should explain the rationale behind their decisions so that their fairness can be assessed. Although this approach is promising in a local context (for example, when the model creator explains the model during debugging at training time), we argue that this reasoning cannot simply be transposed to a remote context, where a model trained by a service provider is accessible to a user only through a network and its application programming interface. This is problematic, as the remote context is precisely the target use case that requires transparency from a societal perspective. Through an analogy with a club bouncer (who may provide untruthful explanations when rejecting customers), we show that providing explanations cannot prevent a remote service from lying about the true reasons behind its decisions. More precisely, we establish the impossibility of remote explainability for single explanations by constructing an attack that hides discriminatory features from the querying user, and we provide an example implementation of this attack. We then show that the probability that an observer spots the attack by issuing several queries and searching the explanations for inconsistencies is low in practical settings. This undermines the very concept of remote explainability in general.

When a company provides automated decisions without disclosing the full model, users and lawmakers might demand a 'right to an explanation'. Le Merrer and Trédan show that malicious manipulations of these explanations are hard to detect, even when simple strategies are used to obscure the model's decisions.
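The attack and the detection argument sketched in the abstract can be made concrete. The following is a minimal illustrative sketch, not the authors' implementation: the feature names (income, debt, gender), the decision rules and the 2x/2.2x thresholds are all assumptions introduced here. The idea is that a biased service decides using a protected feature but returns post hoc explanations citing only legitimate features, so each answer looks coherent on its own; an observer can expose the lie only by comparing several answers.

```python
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class Applicant:
    income: float  # legitimate feature (hypothetical)
    debt: float    # legitimate feature (hypothetical)
    gender: int    # protected feature, 0 or 1 (hypothetical)

def biased_decision(a: Applicant) -> bool:
    """Rule actually run by the service (assumed for illustration): the fair
    rule income > 2*debt, except a narrow band where gender == 1 applicants
    are rejected anyway."""
    if a.gender == 1 and a.income < 2.2 * a.debt:
        return False
    return a.income > 2 * a.debt

def explanation(a: Applicant, decision: bool) -> str:
    """Post hoc rationale citing only legitimate features. It always agrees
    with the decision, so a single query looks coherent (the 'bouncer')."""
    if decision:
        return f"accepted: income {a.income:.2f} covers debt {a.debt:.2f}"
    return f"rejected: income {a.income:.2f} too low for debt {a.debt:.2f}"

def remote_api(a: Applicant) -> tuple[bool, str]:
    # All a remote user observes: a decision plus a plausible explanation.
    d = biased_decision(a)
    return d, explanation(a, d)

def probe(n_probes: int, seed: int = 0) -> float:
    """Observer strategy: submit pairs identical on legitimate features but
    differing on gender; a decision flip proves one explanation is a lie.
    Returns the fraction of probes that expose the attack."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_probes):
        income, debt = rng.uniform(0, 10), rng.uniform(0, 5)
        d0, _ = remote_api(Applicant(income, debt, 0))
        d1, _ = remote_api(Applicant(income, debt, 1))
        hits += d0 != d1
    return hits / n_probes

if __name__ == "__main__":
    # The discriminatory band 2*debt < income < 2.2*debt is small, so even
    # this targeted observer exposes the lie on only a few percent of probes.
    print(f"detection rate over 10_000 probes: {probe(10_000):.3f}")
```

Under these assumed feature distributions the paired probe succeeds on only about 5% of queries, echoing the abstract's claim that spotting the attack through repeated queries is unlikely in practical settings; an observer who cannot control the protected feature would fare even worse.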
Year
2020
DOI
10.1038/s42256-020-0216-z
Venue
NATURE MACHINE INTELLIGENCE
DocType
Journal
Volume
2
Issue
9
Citations
0
PageRank
0.34
References
0
Authors
2
Name             Order  Citations  PageRank
Erwan Le Merrer  1      322        23.58
Gilles Trédan    2      100        11.32