Sampling Permutations for Shapley Value Estimation
Abstract
Game-theoretic attribution techniques based on Shapley values are used to interpret black-box machine learning models, but their exact calculation is generally NP-hard, requiring approximation methods for non-trivial models. As the computation of Shapley values can be expressed as a summation over a set of permutations, a common approach is to sample a subset of these permutations for approximation. Unfortunately, standard Monte Carlo sampling methods can exhibit slow convergence, and more sophisticated quasi-Monte Carlo methods have not yet been applied to the space of permutations. To address this, we investigate new approaches based on two classes of approximation methods and compare them empirically. First, we demonstrate quadrature techniques in an RKHS containing functions of permutations, using the Mallows kernel in combination with kernel herding and sequential Bayesian quadrature. The RKHS perspective also leads to quasi-Monte Carlo type error bounds, with a tractable discrepancy measure defined on permutations. Second, we exploit connections between the hypersphere $\mathbb{S}^{d-2}$ and permutations to create practical algorithms for generating permutation samples with good properties. Experiments show the above techniques provide significant improvements for Shapley value estimates over existing methods, converging to a smaller RMSE in the same number of model evaluations.
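To make the permutation-sampling formulation concrete, the following sketch shows the standard Monte Carlo baseline that the abstract says converges slowly: each sampled permutation contributes one marginal-contribution term per player. This is an illustrative example only, not the paper's code; the value function `v`, the player count `d`, and the additive test game are hypothetical choices for demonstration.

```python
import random

def shapley_monte_carlo(v, d, n_samples, seed=0):
    """Estimate Shapley values phi_i = E_pi[ v(pre_i(pi) + {i}) - v(pre_i(pi)) ],
    where pre_i(pi) is the set of players preceding i in a uniformly random
    permutation pi. This is plain Monte Carlo over permutations."""
    rng = random.Random(seed)
    phi = [0.0] * d
    for _ in range(n_samples):
        perm = list(range(d))
        rng.shuffle(perm)          # uniform random permutation of the players
        coalition = set()
        prev = v(coalition)        # value of the empty coalition
        for i in perm:
            coalition.add(i)
            cur = v(coalition)
            phi[i] += cur - prev   # marginal contribution of player i
            prev = cur
    return [p / n_samples for p in phi]

# Hypothetical test game: an additive game v(S) = sum of per-player weights,
# whose Shapley values equal the weights exactly.
weights = [1.0, 2.0, 3.0]
v = lambda S: sum(weights[i] for i in S)
est = shapley_monte_carlo(v, d=3, n_samples=200)
```

The methods in the paper replace the uniform `rng.shuffle` step with structured permutation samples (kernel herding, sequential Bayesian quadrature, or sphere-based constructions) to reduce the RMSE for the same number of evaluations of `v`.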
 Publication:

arXiv e-prints
 Pub Date:
 April 2021
 arXiv:
 arXiv:2104.12199
 Bibcode:
 2021arXiv210412199M
 Keywords:

 Statistics - Machine Learning;
 Computer Science - Machine Learning;
 Mathematics - Combinatorics;
 05A05 (Primary), 65K10, 90C27 (Secondary);
 I.2.6;
 G.2.1
 E-Print:
 33 pages, 13 figures