Metric entropy of causal, discrete-time LTI systems
In  it is shown that recurrent neural networks (RNNs) can learn, in a metric-entropy-optimal manner, discrete-time, linear time-invariant (LTI) systems. This is effected by comparing the number of bits needed to encode the approximating RNN with the metric entropy of the class of LTI systems under consideration [2, 3]. The purpose of this note is to provide an elementary, self-contained proof of the metric entropy results in [2, 3], in the course of which minor mathematical issues appearing in [2, 3] are resolved. These corrections also lead to the correction of a constant in a result in  (see Remark 2.5).
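For context, the metric entropy referred to above is the standard Kolmogorov notion, defined via covering numbers. The following formulation is a general reminder, not taken from this note; the specific metric and system class used in [2, 3] are not spelled out in the abstract:

```latex
% Covering number: the minimal number of closed epsilon-balls
% (in a metric d on a space X) needed to cover a subset K.
N(\epsilon; K, d) \;:=\; \min\Bigl\{\, n \in \mathbb{N} \;:\;
  \exists\, x_1, \dots, x_n \in X \text{ such that }
  K \subseteq \textstyle\bigcup_{i=1}^{n} B_d(x_i, \epsilon) \,\Bigr\}

% Metric entropy: the logarithm of the covering number, i.e., the
% number of bits needed to specify an element of K to accuracy epsilon.
H(\epsilon; K, d) \;:=\; \log_2 N(\epsilon; K, d)
```

An approximation scheme is metric-entropy optimal, loosely speaking, when the number of bits it uses to describe an $\epsilon$-accurate approximant matches $H(\epsilon; K, d)$ up to lower-order terms.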
- Pub Date: November 2022
- Subjects: Mathematics - Dynamical Systems; Computer Science - Information Theory
- arXiv:2105.02556