Explaining dimensionality reduction results using Shapley values
Abstract
Dimensionality reduction (DR) techniques have consistently supported high-dimensional data analysis in various applications. Beyond the patterns these techniques uncover, interpreting DR results based on each feature's contribution to the low-dimensional representation enables new findings through exploratory analysis. Current approaches in the literature designed to interpret DR techniques do not explain the features' contributions well, since they focus only on the low-dimensional representation or do not consider relationships among features. This paper presents ClusterShapley to address these problems, using Shapley values to generate explanations of dimensionality reduction techniques and to interpret these algorithms through a cluster-oriented analysis. ClusterShapley explains the formation of clusters and the meaning of their relationships, which is useful for exploratory data analysis in various domains. We propose novel visualization techniques to guide the interpretation of features' contributions to cluster formation and validate our methodology through case studies on publicly available datasets. The results demonstrate our approach's interpretability and analytical power to generate insights about pathologies and patients in different conditions using DR results.
- Publication: arXiv e-prints
- Pub Date: March 2021
- DOI: 10.48550/arXiv.2103.05678
- arXiv: arXiv:2103.05678
- Bibcode: 2021arXiv210305678E
- Keywords: Computer Science - Machine Learning; Computer Science - Graphics
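The abstract does not detail ClusterShapley's algorithm, so the snippet below is only a minimal sketch of the general idea it builds on: computing exact Shapley values to attribute a cluster-level quantity to individual features. The `cluster_separation` value function and the two centroids are hypothetical illustrations chosen for this sketch, not taken from the paper; the `shapley_values` routine itself is the standard weighted-subset Shapley formula.

```python
from itertools import combinations
from math import factorial

def shapley_values(n_features, value):
    """Exact Shapley values via the classic weighted-subset formula.

    `value` maps a frozenset of feature indices to a score; the Shapley
    value of feature i is its marginal contribution to that score,
    averaged over all subsets of the remaining features.
    """
    phi = [0.0] * n_features
    for i in range(n_features):
        others = [j for j in range(n_features) if j != i]
        for r in range(len(others) + 1):
            for subset in combinations(others, r):
                s = frozenset(subset)
                # weight = |S|! * (n - |S| - 1)! / n!
                w = factorial(r) * factorial(n_features - r - 1) / factorial(n_features)
                phi[i] += w * (value(s | {i}) - value(s))
    return phi

# Hypothetical toy stand-in for a cluster-oriented value function:
# score a feature subset by how far apart it pushes two cluster
# centroids (squared distance restricted to those features).
centroid_a = [1.0, 0.0, 2.0]
centroid_b = [1.0, 4.0, 2.0]

def cluster_separation(features):
    return sum((centroid_a[i] - centroid_b[i]) ** 2 for i in features)

phi = shapley_values(3, cluster_separation)
# Only feature 1 separates the two centroids, so it receives the whole
# attribution, and the values sum to the full separation (efficiency).
print([round(v, 6) for v in phi])  # → [0.0, 16.0, 0.0]
```

In the actual method one would replace this toy value function with a quantity derived from the DR output (e.g., a model's cluster-membership score), so that each feature's Shapley value explains its role in forming a given cluster in the projection.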