Dimension reduction in recurrent networks by canonicalization
Abstract
Many recurrent neural network machine learning paradigms can be formulated using state-space representations. The classical notion of canonical state-space realization is adapted in this paper to accommodate semi-infinite inputs so that it can be used as a dimension reduction tool in the recurrent networks setup. The so-called input forgetting property is identified as the key hypothesis that guarantees the existence and uniqueness (up to system isomorphisms) of canonical realizations for causal and time-invariant input/output systems with semi-infinite inputs. Additionally, the notion of optimal reduction coming from the theory of symmetric Hamiltonian systems is implemented in our setup to construct canonical realizations out of input forgetting but not necessarily canonical ones. These two procedures are studied in detail in the framework of linear fading memory input/output systems. Finally, the notion of implicit reduction using reproducing kernel Hilbert spaces (RKHS) is introduced, which, for systems with linear readouts, achieves dimension reduction without the need to actually compute the reduced spaces introduced in the first part of the paper.
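To illustrate the ideas at play, the following is a minimal sketch (not taken from the paper) of a linear state-space system x_{t+1} = A x_t + B u_t, y_t = C x_t. When the spectral radius of A is below one the system forgets remote inputs (a linear instance of the fading memory / input forgetting property), and a canonical realization discards the state directions that are either unreachable from the inputs or unobservable at the output. All matrices below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A deliberately non-minimal 4-dimensional realization: the last two state
# coordinates are never excited by the input (zero rows of B) and never
# read out (zero columns of C), so they carry no input/output information.
A = np.diag([0.5, 0.3, 0.8, 0.9])          # spectral radius < 1 -> input forgetting
B = np.array([[1.0], [1.0], [0.0], [0.0]])
C = np.array([[1.0, -1.0, 0.0, 0.0]])

n = A.shape[0]
# Kalman-style reachability and observability matrices.
R = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
O = np.vstack([C @ np.linalg.matrix_power(A, k) for k in range(n)])
print(np.linalg.matrix_rank(R), np.linalg.matrix_rank(O))  # both 2: minimal dimension is 2

# Reduce: project onto the reachable/observable subspace (here, the
# first two coordinates).
P = np.eye(n)[:, :2]
Ar, Br, Cr = P.T @ A @ P, P.T @ B, C @ P

def run(A, B, C, u):
    """Drive the system from the zero state and record the outputs."""
    x = np.zeros(A.shape[0])
    ys = []
    for ut in u:
        x = A @ x + B @ np.atleast_1d(ut)
        ys.append(float(C @ x))
    return ys

u = rng.standard_normal(20)
# The 2-dimensional realization induces the same input/output map.
assert np.allclose(run(A, B, C, u), run(Ar, Br, Cr, u))
```

This toy example only covers the classical finite-input linear case; the paper's contribution is extending canonicality to semi-infinite inputs and to the nonlinear recurrent-network setting.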
Publication:
arXiv e-prints
Pub Date:
July 2020
arXiv:
arXiv:2007.12141
Bibcode:
2020arXiv200712141G
Keywords:
Mathematics - Optimization and Control;
Computer Science - Machine Learning;
Computer Science - Neural and Evolutionary Computing;
Mathematics - Dynamical Systems
E-Print:
31 pages