Adaptive Reduced Rank Regression
Abstract
We study the low-rank regression problem $y = Mx + \epsilon$, where $x$ and $y$ are $d_1$- and $d_2$-dimensional vectors respectively. We consider the extreme high-dimensional setting where the number of observations $n$ is less than $d_1 + d_2$. Existing algorithms are designed for settings where $n$ is typically as large as $\mathrm{rank}(M)(d_1+d_2)$. This work provides an efficient algorithm which involves only two SVDs, and establishes statistical guarantees on its performance. The algorithm decouples the problem by first estimating the precision matrix of the features, and then solving the matrix denoising problem. To complement the upper bound, we introduce new techniques for establishing lower bounds on the performance of any algorithm for this problem. Our preliminary experiments confirm that our algorithm often outperforms existing baselines, and is always at least competitive.
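The two-stage structure described above (estimate the feature precision matrix, then denoise by SVD truncation) can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact algorithm: the dimensions, the ridge regularization of the sample covariance, and the hard rank-$r$ truncation are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper's setting has n < d1 + d2.
n, d1, d2, r = 50, 40, 30, 3

# Synthetic rank-r ground truth M (d2 x d1), following y = M x + eps.
M_true = rng.standard_normal((d2, r)) @ rng.standard_normal((r, d1))
X = rng.standard_normal((n, d1))                  # rows are samples of x
Y = X @ M_true.T + 0.1 * rng.standard_normal((n, d2))

# Stage 1 (assumed form): estimate the precision matrix of the features.
# A ridge term is added because n < d1 makes the sample covariance singular.
Sigma_hat = X.T @ X / n
Theta_hat = np.linalg.inv(Sigma_hat + 0.1 * np.eye(d1))

# Stage 2 (assumed form): form a raw estimate of M, then denoise it
# by keeping only the top-r singular directions.
M_raw = (Y.T @ X / n) @ Theta_hat
U, s, Vt = np.linalg.svd(M_raw, full_matrices=False)
M_hat = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

print(M_hat.shape)  # (30, 40): a rank-r estimate of M
```

By construction `M_hat` has rank at most $r$, which is the sense in which the SVD step "denoises" the raw estimate.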
Publication: arXiv e-prints
Pub Date: May 2019
arXiv: arXiv:1905.11566
Bibcode: 2019arXiv190511566W
Keywords: Computer Science - Data Structures and Algorithms; Computer Science - Machine Learning
E-Print: 36 pages