Global universal approximation of functional input maps on weighted spaces
Abstract
We introduce so-called functional input neural networks defined on a possibly infinite-dimensional weighted space, with values also in a possibly infinite-dimensional output space. To this end, we use an additive family as hidden layer maps and a non-linear activation function applied to each hidden layer. Relying on Stone-Weierstrass theorems on weighted spaces, we prove a global universal approximation result for generalizations of continuous functions that goes beyond the usual approximation on compact sets. In particular, this applies to the approximation of (non-anticipative) path-space functionals via functional input neural networks. As a further application of the weighted Stone-Weierstrass theorem, we prove a global universal approximation result for linear functions of the signature. We also introduce the viewpoint of Gaussian process regression in this setting and show that the reproducing kernel Hilbert spaces of the signature kernels are Cameron-Martin spaces of certain Gaussian processes. This paves the way towards uncertainty quantification for signature kernel regression.
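The "linear functions of the signature" mentioned in the abstract can be made concrete with a small numerical sketch. The following is a hypothetical NumPy implementation (the function name and setup are our own, not taken from the paper) that computes the truncated signature of a piecewise-linear path up to level 2 using Chen's identity; a linear map applied to these iterated integrals is then an instance of the approximating class whose global density the paper studies.

```python
import numpy as np

def signature_level2(path):
    """Truncated signature (levels 1 and 2) of a piecewise-linear
    path, given as an (N+1, d) array of sample points.

    Applies Chen's identity segment by segment: a linear segment
    with increment delta contributes S1 (x) delta + delta (x) delta / 2
    to the level-2 iterated integrals.
    """
    path = np.asarray(path, dtype=float)
    d = path.shape[1]
    S1 = np.zeros(d)          # level 1: total increment of the path
    S2 = np.zeros((d, d))     # level 2: iterated integrals
    for delta in np.diff(path, axis=0):
        S2 += np.outer(S1, delta) + 0.5 * np.outer(delta, delta)
        S1 += delta
    return S1, S2

# Example: a planar path. The shuffle identity S2 + S2^T = S1 (x) S1
# holds for every path and serves as a sanity check.
pts = np.array([[0.0, 0.0], [1.0, 0.5], [1.5, 2.0], [0.5, 3.0]])
S1, S2 = signature_level2(pts)
assert np.allclose(S2 + S2.T, np.outer(S1, S1))
```

The antisymmetric part of S2 is the Lévy area of the path; a learned linear functional on the flattened signature entries plays the role of the linear functions of the signature for which the paper proves global universal approximation.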
 Publication:
 arXiv e-prints
 Pub Date:
 June 2023
 DOI:
 10.48550/arXiv.2306.03303
 arXiv:
 arXiv:2306.03303
 Bibcode:
 2023arXiv230603303C
 Keywords:
 Statistics - Machine Learning;
 Computer Science - Machine Learning;
 Mathematics - Functional Analysis;
 Mathematics - Probability;
 Quantitative Finance - Mathematical Finance;
 26A16;
 26E20;
 41A65;
 41A81;
 46E40;
 60L10;
 68T07
 E-Print:
 57 pages, 4 figures