
Low tensor-ring rank completion: parallel matrix factorization with smoothness on latent space

  • Original Article
  • Published in Neural Computing and Applications

Abstract

In recent years, tensor ring (TR) decomposition has drawn considerable attention and has been successfully applied to the tensor completion problem, owing to its compact representation ability. As is well known, both global and local structural information are important for tensor completion. Although existing TR-based completion algorithms achieve impressive performance in visual-data inpainting by exploiting low-rank global structure, most of them do not account for the local smoothness that visual data typically exhibit. To further improve inpainting performance, our model incorporates both low-rank and piecewise-smooth structures. Instead of applying the local smoothness constraint directly to the data, we impose smoothness on the latent TR-space, which greatly reduces the computational cost, especially for large-scale data. Extensive experiments on real-world visual data show that our model not only achieves state-of-the-art performance but is also stable with respect to the choice of TR-ranks, owing to the local smoothness constraint.
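To make the latent-space smoothness idea concrete, here is a minimal sketch, not the authors' implementation: it recovers a toy tensor from partial observations by fitting TR cores with a smoothness penalty applied to the cores themselves. For brevity it uses plain gradient descent via JAX autodiff rather than the paper's parallel matrix-factorization updates, and it realizes smoothness as squared first-order differences along each core's mode axis; all function names, ranks, and hyperparameters below are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def tr_reconstruct(cores):
    """Contract TR cores G_k of shape (r, I_k, r) into the full tensor."""
    out = cores[0]                                    # (r, I_1, r)
    for G in cores[1:]:
        # chain step: (r, I_1..I_k, s) x (s, I_{k+1}, t) -> (r, I_1..I_{k+1}, t)
        out = jnp.einsum('a...b,bic->a...ic', out, G)
    # close the ring: trace over the first and last rank indices
    return jnp.einsum('a...a->...', out)

def loss(cores, data, mask, lam):
    """Masked fit plus smoothness imposed on the latent cores, not the data.

    The penalty sums squared first-order differences along each core's mode
    axis, so it touches only the small latent parameters (r * I_k * r per
    mode) rather than all prod(I_k) entries of the reconstructed tensor."""
    rec = tr_reconstruct(cores)
    fit = 0.5 * jnp.sum(mask * (rec - data) ** 2) / jnp.sum(mask)
    smooth = sum(jnp.sum(jnp.diff(G, axis=1) ** 2) for G in cores)
    return fit + lam * smooth

# Toy run: a 20x20x20 tensor of TR-rank 3 with 30% of entries observed.
shape, rank = (20, 20, 20), 3
keys = jax.random.split(jax.random.PRNGKey(0), 2 * len(shape) + 1)
true_cores = [jax.random.normal(keys[k], (rank, n, rank)) / rank
              for k, n in enumerate(shape)]
data = tr_reconstruct(true_cores)
mask = (jax.random.uniform(keys[-1], shape) < 0.3).astype(data.dtype)

cores = [0.3 * jax.random.normal(keys[len(shape) + k], (rank, n, rank))
         for k, n in enumerate(shape)]
grad_fn = jax.jit(jax.grad(loss))     # gradient w.r.t. the list of cores
for _ in range(3000):                 # plain gradient descent, for brevity
    grads = grad_fn(cores, data, mask, 1e-4)
    cores = [G - 0.3 * dG for G, dG in zip(cores, grads)]

rec = tr_reconstruct(cores)
print("relative error:", jnp.linalg.norm(rec - data) / jnp.linalg.norm(data))
```

The design point is the cost of the penalty: differencing the cores involves only O(∑_k r² I_k) latent parameters, whereas smoothing the reconstructed tensor would involve all ∏_k I_k entries, which is why regularizing the latent TR-space is much cheaper for large-scale data.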


Data availability

The data that support the findings of this study are available from the corresponding author or the first author upon reasonable request.


Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (Nos. 62203128 and 52171331), in part by the Guangdong Province Key Field R&D Program, China (No. 2020B0101050001), and in part by the Science and Technology Planning Project of Guangzhou City under Grant 202102010411.

Author information


Corresponding author

Correspondence to Tao Zou.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Yu, J., Zou, T. & Zhou, G. Low tensor-ring rank completion: parallel matrix factorization with smoothness on latent space. Neural Comput & Applic 35, 7003–7016 (2023). https://doi.org/10.1007/s00521-022-08023-5

