Robust PCA and subspace tracking from incomplete observations using \(\ell _0\)-surrogates

  • Original Paper
  • Published in: Computational Statistics

Abstract

Many applications in data analysis rely on the decomposition of a data matrix into a low-rank and a sparse component. Existing methods that tackle this task use the nuclear norm and \(\ell _1\)-cost functions as convex relaxations of the rank constraint and the sparsity measure, respectively, or employ thresholding techniques. We propose a method that allows for reconstructing and tracking a subspace of upper-bounded dimension from incomplete and corrupted observations. It does not require any a priori information about the number of outliers. The core of our algorithm is an intrinsic Conjugate Gradient method on the set of orthogonal projection matrices, the so-called Grassmannian. Non-convex sparsity measures are used for outlier detection, which leads to improved performance in terms of robustly recovering and tracking the low-rank matrix. In particular, our approach can cope with more outliers and with an underlying matrix of higher rank than other state-of-the-art methods.
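For orientation, the contrast between these two modelling choices can be sketched as follows; the notation below (observed index set \(\Omega \), rank bound \(k\), smoothing parameter \(\mu \)) is a generic illustration and not necessarily the formulation used in the paper. The convex approach solves

\[
\min _{L,S}\; \Vert L\Vert _{*} + \lambda \Vert S\Vert _{1} \quad \text {subject to} \quad \mathcal {P}_{\Omega }(L+S) = \mathcal {P}_{\Omega }(X),
\]

where the nuclear norm \(\Vert \cdot \Vert _{*}\) relaxes the rank constraint, the \(\ell _1\)-norm relaxes the \(\ell _0\) count of the outliers, and \(\mathcal {P}_{\Omega }\) restricts to the observed entries. A non-convex alternative keeps an explicit rank bound and replaces the \(\ell _1\)-norm by a smooth \(\ell _0\)-surrogate, for example

\[
\min _{\operatorname {rank}(L)\le k}\; \sum _{(i,j)\in \Omega } h_{\mu }\!\left( X_{ij}-L_{ij}\right) , \qquad h_{\mu }(t) = \log \!\left( 1+\tfrac{t^{2}}{\mu }\right) ,
\]

where, after suitable normalization, \(h_{\mu }\) approaches the indicator of a non-zero residual as \(\mu \rightarrow 0\), so the sum approximates the number of outliers. The explicit rank bound is what permits the optimization to be carried out over the Grassmannian of \(k\)-dimensional subspaces with an intrinsic Conjugate Gradient method, as outlined above.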





Author information

Corresponding author

Correspondence to Clemens Hage.

Additional information

This work has been supported by the DFG Excellence Initiative research cluster CoTeSys.

About this article

Cite this article

Hage, C., Kleinsteuber, M. Robust PCA and subspace tracking from incomplete observations using \(\ell _0\)-surrogates. Comput Stat 29, 467–487 (2014). https://doi.org/10.1007/s00180-013-0435-4
