Variational Bayesian Inference for CP Tensor Completion with Subspace Information

Abstract

We propose an algorithm for Bayesian low-rank tensor completion with automatic rank determination in the canonical polyadic (CP) format when additional subspace information (SI) is given. We numerically validate the regularization properties induced by SI and present results on tensor recovery and rank determination. The results show that the number of samples required for successful completion is significantly reduced in the presence of SI.
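
To make the setting concrete, here is a minimal sketch of CP completion with subspace information. This is not the authors' variational Bayesian algorithm: it is a plain alternating least-squares (ALS) surrogate for a 3-way tensor in which each factor matrix is constrained to a given column subspace, U_k = W_k C_k. All names, signatures, and the ALS substitution are illustrative assumptions; the paper's method additionally infers the CP rank automatically via Bayesian inference.

```python
# Minimal sketch (assumed ALS surrogate, not the authors' VB method):
# rank-R CP completion of a 3-way tensor whose k-th factor is constrained
# to the column span of a given subspace basis W_k, i.e. U_k = W_k @ C_k.
import numpy as np

def cp_completion_with_subspaces(T, mask, subspaces, rank, n_iter=50, reg=1e-6):
    """Fit a rank-`rank` CP model to the observed entries of a 3-way tensor T
    (mask == True marks observed entries), with the k-th factor constrained
    to lie in the column span of subspaces[k]."""
    rng = np.random.default_rng(0)
    # Small coefficient matrices C_k; the actual factors are U_k = W_k @ C_k.
    C = [rng.standard_normal((W.shape[1], rank)) for W in subspaces]
    U = [W @ Ck for W, Ck in zip(subspaces, C)]
    obs = np.argwhere(mask)   # indices of observed entries (row-major order)
    vals = T[mask]            # observed values, in the same order
    for _ in range(n_iter):
        for k in range(3):
            W = subspaces[k]
            # Each observed entry T[i,j,l] = sum_r U0[i,r] U1[j,r] U2[l,r]
            # is linear in C_k; assemble a (naive, dense) design matrix.
            rows = []
            for ind in obs:
                kr = np.ones(rank)
                for j in range(3):
                    if j != k:
                        kr = kr * U[j][ind[j]]
                rows.append(np.kron(kr, W[ind[k]]))  # coefficients of vec(C_k)
            A = np.asarray(rows)
            # Regularized normal equations for the subspace coefficients.
            c = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ vals)
            C[k] = c.reshape(rank, W.shape[1]).T     # undo the kron ordering
            U[k] = W @ C[k]
    return np.einsum('ir,jr,lr->ijl', U[0], U[1], U[2])
```

The subspace constraint is what the abstract calls subspace information: each factor is parametrized by a small coefficient matrix C_k instead of a full U_k, which shrinks the number of unknowns and, qualitatively, reduces the number of observed samples needed for recovery. Unlike this fixed-rank sketch, the paper's variational Bayesian approach determines the rank automatically.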

Funding

This work was supported by the Russian Science Foundation (project no. 21-71-10072).

Author information

Correspondence to S. Budzinskiy or N. Zamarashkin.

Additional information

(Submitted by E. E. Tyrtyshnikov)

About this article

Cite this article

Budzinskiy, S., Zamarashkin, N. Variational Bayesian Inference for CP Tensor Completion with Subspace Information. Lobachevskii J Math 44, 3016–3027 (2023). https://doi.org/10.1134/S1995080223080103
