Sum-of-squares meets square loss: Fast rates for agnostic tensor completion
Abstract
We study tensor completion in the agnostic setting. In the classical tensor completion problem, we receive $n$ entries of an unknown rank-$r$ tensor and wish to exactly complete the remaining entries. In agnostic tensor completion, we make no assumption on the rank of the unknown tensor, but attempt to predict unknown entries as well as the best rank-$r$ tensor. For agnostic learning of third-order tensors with the square loss, we give the first polynomial time algorithm that obtains a "fast" (i.e., $O(1/n)$-type) rate improving over the rate obtained by reduction to matrix completion. Our prediction error rate to compete with the best $d\times d\times d$ tensor of rank-$r$ is $\tilde{O}(r^{2}d^{3/2}/n)$. We also obtain an exact oracle inequality that trades off estimation and approximation error. Our algorithm is based on the degree-six sum-of-squares relaxation of the tensor nuclear norm. The key feature of our analysis is to show that a certain characterization for the subgradient of the tensor nuclear norm can be encoded in the sum-of-squares proof system. This unlocks the standard toolbox for localization of empirical processes under the square loss, and allows us to establish restricted eigenvalue-type guarantees for various tensor regression models, with tensor completion as a special case. The new analysis of the relaxation complements Barak and Moitra (2016), who gave slow rates for agnostic tensor completion, and Potechin and Steurer (2017), who gave exact recovery guarantees for the noiseless setting. Our techniques are user-friendly, and we anticipate that they will find use elsewhere.
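The abstract compares against the baseline of reducing tensor completion to matrix completion. A minimal sketch of that reduction (not the paper's sum-of-squares algorithm, just the standard "flattening" trick it improves upon): a $d\times d\times d$ tensor of rank $r$ can be reshaped into a $d\times d^{2}$ matrix whose rank is at most $r$, so off-the-shelf matrix completion machinery applies, at the cost of the slower rate the paper improves. All dimensions and the random construction below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): flatten a rank-r
# third-order tensor into a d x d^2 matrix, to which standard matrix
# completion (e.g. nuclear-norm minimization) can be applied.
d, r = 8, 2  # hypothetical small dimensions for illustration
rng = np.random.default_rng(0)

# Build a rank-r tensor as a sum of r outer products u (x) v (x) w.
T = sum(
    np.einsum(
        "i,j,k->ijk",
        rng.standard_normal(d),
        rng.standard_normal(d),
        rng.standard_normal(d),
    )
    for _ in range(r)
)

# Flatten mode 1 against modes (2, 3): the result has rank at most r.
M = T.reshape(d, d * d)
rank_M = np.linalg.matrix_rank(M)
print(M.shape, rank_M)  # rank_M <= r
```

The flattened matrix loses the tensor structure, which is why the matrix-completion reduction yields only the slower rate; the paper's degree-six sum-of-squares relaxation of the tensor nuclear norm exploits the third-order structure directly.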
 Publication:

arXiv e-prints
 Pub Date:
 May 2019
 DOI:
 10.48550/arXiv.1905.13283
 arXiv:
 arXiv:1905.13283
 Bibcode:
 2019arXiv190513283F
 Keywords:

 Computer Science - Machine Learning;
 Computer Science - Data Structures and Algorithms;
 Mathematics - Statistics Theory;
 Statistics - Machine Learning
 E-Print:
 To appear at COLT 2019