Adaptive quantile low-rank matrix factorization
Abstract
Low-rank matrix factorization (LRMF) has gained much popularity owing to its successful applications in both computer vision and data mining. By assuming the noise to come from a Gaussian, Laplace, or mixture-of-Gaussians distribution, significant effort has been devoted to optimizing the (weighted) L_1- or L_2-norm loss between an observed matrix and its bilinear factorization. However, the type of noise distribution is generally unknown in real applications, and inappropriate assumptions will inevitably degrade the performance of LRMF. Moreover, real data are often corrupted by skewed rather than symmetric noise. To tackle this problem, this paper presents a novel LRMF model, called AQLRMF, that models the noise with a mixture of asymmetric Laplace distributions. An efficient algorithm based on the expectation-maximization (EM) framework is also provided to estimate the parameters of AQLRMF. The AQLRMF model has the advantage that it can approximate the noise well regardless of whether the real noise is symmetric or skewed. The core idea of AQLRMF lies in solving a weighted L_1 problem whose weights are learned from the data. Experiments conducted on synthetic and real data sets show that AQLRMF outperforms several state-of-the-art techniques. Furthermore, AQLRMF is also superior to the other algorithms in capturing the local structural information contained in real images.
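To make the "weighted L_1 problem" concrete, the sketch below approximates argmin over U, V of sum_ij W_ij |Y_ij - (U V^T)_ij| using iteratively reweighted least squares (IRLS) with alternating row-wise updates. This is only an illustrative sketch of a weighted-L_1 factorization, not the paper's EM algorithm; in AQLRMF the weights W would be learned from the fitted asymmetric-Laplace mixture, whereas here they are supplied as an input, and the function name and all parameters are hypothetical.

```python
import numpy as np

def weighted_l1_lrmf(Y, W, rank, n_iters=50, eps=1e-4):
    """Illustrative sketch: approximately minimize
    sum_ij W_ij |Y_ij - (U V^T)_ij| by IRLS.

    Each L1 term is replaced by a weighted L2 surrogate with
    weight W_ij / (|residual_ij| + eps); U and V are then updated
    by closed-form row-wise weighted least squares.
    """
    m, n = Y.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    ridge = 1e-8 * np.eye(rank)  # small regularizer for stability
    for _ in range(n_iters):
        R = Y - U @ V.T
        C = W / (np.abs(R) + eps)          # IRLS surrogate weights
        for i in range(m):                  # update row i of U
            VC = V * C[i][:, None]
            U[i] = np.linalg.solve(VC.T @ V + ridge, VC.T @ Y[i])
        for j in range(n):                  # update row j of V
            UC = U * C[:, j][:, None]
            V[j] = np.linalg.solve(UC.T @ U + ridge, UC.T @ Y[:, j])
    return U, V
```

With uniform weights this reduces to plain L_1 LRMF, which is already robust to sparse gross corruptions; AQLRMF's contribution is replacing the fixed weights with ones adapted to the (possibly skewed) noise distribution.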
 Publication:

Pattern Recognition
 Pub Date:
 July 2020
 DOI:
 10.1016/j.patcog.2020.107310
 arXiv:
 arXiv:1901.00140
 Bibcode:
 2020PatRe.10307310X
 Keywords:

 Low-rank matrix factorization;
 Mixture of asymmetric Laplace distributions;
 Expectation-maximization algorithm;
 Skew noise;
 Statistics - Machine Learning;
 Computer Science - Machine Learning
 EPrint:
 Pattern Recognition, Volume 103, July 2020, 107310