
Null space conditions and thresholds for rank minimization

  • Full Length Paper
  • Series B
  • Published in: Mathematical Programming

Abstract

Minimizing the rank of a matrix subject to constraints is a challenging problem that arises in many applications in machine learning, control theory, and discrete geometry. This class of optimization problems, known as rank minimization, is NP-hard, and for most practical problems there are no efficient algorithms that yield exact solutions. A popular heuristic replaces the rank function with the nuclear norm—equal to the sum of the singular values—of the decision variable and has been shown to provide the optimal low-rank solution in a variety of scenarios. In this paper, we assess the practical performance of this heuristic for finding the minimum-rank matrix subject to linear equality constraints. We characterize properties of the null space of the linear operator defining the constraint set that are necessary and sufficient for the heuristic to succeed. We then analyze linear constraints sampled uniformly at random and obtain dimension-free bounds under which our null space properties hold almost surely as the matrix dimensions tend to infinity. Finally, we provide empirical evidence that these probabilistic bounds provide accurate predictions of the heuristic’s performance in non-asymptotic scenarios.
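
To make the heuristic concrete, here is a minimal sketch in Python using NumPy and CVXPY (assumed dependencies, not part of the paper): it draws a random low-rank matrix, takes random Gaussian linear measurements of it, and attempts recovery by minimizing the nuclear norm subject to the corresponding linear equality constraints. The problem sizes and variable names are illustrative only.

```python
# Minimal sketch of the nuclear norm heuristic for rank minimization,
# assuming NumPy and CVXPY are available. Sizes below are illustrative.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

n, r, m = 10, 2, 60  # matrix size, true rank, number of linear measurements

# Ground-truth low-rank matrix X0 and random Gaussian measurement matrices A_i.
X0 = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
A = [rng.standard_normal((n, n)) for _ in range(m)]
b = np.array([np.trace(Ai.T @ X0) for Ai in A])  # measurements b_i = <A_i, X0>

# Nuclear norm heuristic: minimize the sum of singular values of X
# subject to the linear equality constraints <A_i, X> = b_i.
X = cp.Variable((n, n))
constraints = [cp.trace(A[i].T @ X) == b[i] for i in range(m)]
prob = cp.Problem(cp.Minimize(cp.normNuc(X)), constraints)
prob.solve()

print("relative recovery error:",
      np.linalg.norm(X.value - X0) / np.linalg.norm(X0))
```

With enough generic measurements relative to the degrees of freedom of a rank-r matrix, the recovered X typically matches X0 to solver precision; with too few measurements the heuristic fails, which is the regime the paper's null space conditions and thresholds characterize.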



Author information

Corresponding author

Correspondence to Benjamin Recht.


About this article

Cite this article

Recht, B., Xu, W. & Hassibi, B. Null space conditions and thresholds for rank minimization. Math. Program. 127, 175–202 (2011). https://doi.org/10.1007/s10107-010-0422-2

