Exploring predictive states via Cantor embeddings and Wasserstein distance
Abstract
Predictive states for stochastic processes are a nonparametric and interpretable construct with relevance across a multitude of modeling paradigms. Recent progress on the self-supervised reconstruction of predictive states from time-series data has focused on the use of reproducing kernel Hilbert spaces. Here, we examine how Wasserstein distances may be used to detect predictive equivalences in symbolic data. We compute Wasserstein distances between distributions over sequences ("predictions"), using a finite-dimensional embedding of sequences based on the Cantor set to supply the underlying geometry. We show that exploratory data analysis of the resulting geometry, via hierarchical clustering and dimension reduction, provides insight into the temporal structure of processes ranging from the relatively simple (e.g., generated by finite-state hidden Markov models) to the very complex (e.g., generated by infinite-state indexed grammars).
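The pipeline described in the abstract can be illustrated on a toy example. The sketch below (a minimal illustration assuming details not given in the abstract: the specific base-3 Cantor encoding, the sequence lengths, and the Golden Mean process as a stand-in for a finite-state hidden Markov model) embeds binary future sequences into [0, 1] via a Cantor-set expansion and compares the resulting one-dimensional empirical prediction distributions with the 1-Wasserstein distance; past contexts that share a predictive state end up close, while predictively distinct contexts end up far apart.

```python
import random

def cantor_embed(seq, alphabet_size=2):
    """Map a finite symbol sequence to a point in [0, 1] via a
    Cantor-set style expansion: digit i contributes 2*s_i to a
    base-(2k-1) expansion, so distinct sequences stay separated.
    For a binary alphabet this is the classic middle-thirds map."""
    base = 2 * alphabet_size - 1
    return sum(2 * s / base ** (i + 1) for i, s in enumerate(seq))

def wasserstein_1d(xs, ys):
    """1-Wasserstein distance between two empirical samples on the
    line, computed as the integral of |F_x(t) - F_y(t)| dt over the
    merged support (sizes of the two samples may differ)."""
    xs, ys = sorted(xs), sorted(ys)
    grid = sorted(set(xs + ys))
    d, ix, iy, prev = 0.0, 0, 0, grid[0]
    for v in grid:
        d += abs(ix / len(xs) - iy / len(ys)) * (v - prev)
        while ix < len(xs) and xs[ix] == v:
            ix += 1
        while iy < len(ys) and ys[iy] == v:
            iy += 1
        prev = v
    return d

# Sample the Golden Mean process (a 1 is never followed by a 1); its
# two predictive states are "last symbol was 0" and "last symbol was 1".
random.seed(0)
seq, prev_sym = [], 0
for _ in range(20000):
    sym = 0 if prev_sym == 1 else random.randint(0, 1)
    seq.append(sym)
    prev_sym = sym

# Group Cantor-embedded length-6 futures by their length-2 past context.
P, F = 2, 6
futures = {}
for i in range(P, len(seq) - F):
    past = tuple(seq[i - P:i])
    futures.setdefault(past, []).append(cantor_embed(seq[i:i + F]))

# Pasts (0,0) and (1,0) both end in 0, so they share a predictive state
# and their prediction distributions should be nearly identical; (0,1)
# forces the next symbol to be 0 and predicts differently.
d_same = wasserstein_1d(futures[(0, 0)], futures[(1, 0)])
d_diff = wasserstein_1d(futures[(0, 0)], futures[(0, 1)])
print(f"same-state distance {d_same:.4f} < different-state {d_diff:.4f}")
```

The pairwise distance matrix produced this way is what would feed the hierarchical clustering and dimension reduction described in the abstract; for that step an off-the-shelf tool such as `scipy.cluster.hierarchy` could be used.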
- Publication:
- Chaos
- Pub Date:
- December 2022
- DOI:
- 10.1063/5.0102603
- arXiv:
- arXiv:2206.04198
- Bibcode:
- 2022Chaos..32l3115L
- Keywords:
- Condensed Matter - Statistical Mechanics;
- Computer Science - Machine Learning;
- Mathematics - Dynamical Systems;
- Statistics - Machine Learning
- E-Print:
- 9 pages, 4 figures