Seq2Tens: An Efficient Representation of Sequences by Low-Rank Tensor Projections
Abstract
Sequential data such as time series, video, or text can be challenging to analyse as the ordered structure gives rise to complex dependencies. At the heart of this is non-commutativity, in the sense that reordering the elements of a sequence can completely change its meaning. We use a classical mathematical object, the tensor algebra, to capture such dependencies. To address the innate computational complexity of high-degree tensors, we use compositions of low-rank tensor projections. This yields modular and scalable building blocks for neural networks that give state-of-the-art performance on standard benchmarks such as multivariate time series classification and generative models for video.
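The abstract's core idea, projecting the high-degree tensors of a sequence onto low-rank (here rank-1) functionals, can be sketched without ever materialising a tensor. The snippet below is a minimal illustration, not the authors' implementation: the function name `low_rank_projection` and the dynamic-programming formulation are this sketch's own choices. For a rank-1 functional v_1 ⊗ … ⊗ v_m, the inner product with the degree-m tensor of the sequence reduces to a sum over increasing index tuples of products of scalar inner products, computable in O(T·m·d) time.

```python
import numpy as np

def low_rank_projection(x, vs):
    """Inner product of the degree-m tensor of a sequence with a
    rank-1 functional v_1 (x) ... (x) v_m, without forming tensors.

    x  : (T, d) array, the sequence x_1, ..., x_T
    vs : list of m vectors of shape (d,), the rank-1 components

    Returns sum over i_1 < ... < i_m of prod_j <v_j, x_{i_j}>,
    computed by dynamic programming in O(T * m * d) time.
    """
    T = x.shape[0]
    m = len(vs)
    inner = np.stack([x @ v for v in vs])  # inner[j, t] = <v_{j+1}, x_{t+1}>
    # dp[j] accumulates sums over increasing tuples of length j+1
    dp = np.zeros(m)
    for t in range(T):
        # update high degrees first, so dp[j-1] still only covers indices < t
        for j in range(m - 1, 0, -1):
            dp[j] += dp[j - 1] * inner[j, t]
        dp[0] += inner[0, t]
    return dp[m - 1]

# Toy check with d = 1, m = 2: pairs (1,2), (1,3), (2,3) give 2 + 3 + 6 = 11.
x = np.array([[1.0], [2.0], [3.0]])
vs = [np.array([1.0]), np.array([1.0])]
print(low_rank_projection(x, vs))  # 11.0
```

Composing several such projections, as the abstract describes, would stack layers of this primitive; the sketch only shows the single-projection building block.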
Publication: arXiv e-prints
Pub Date: June 2020
arXiv: arXiv:2006.07027
Bibcode: 2020arXiv200607027T
Keywords: Computer Science - Machine Learning; Statistics - Machine Learning
E-Print: 37 pages, 6 figures, 8 tables