Tensor Decomposition: Basics, Algorithms, and Recent Advances

Chapter in: Bayesian Tensor Decomposition for Signal Processing and Machine Learning

Abstract

In this chapter, we first introduce the preliminaries of tensors, including terminology, associated notation, related multilinear algebra, and, most importantly, widely used tensor decomposition formats. We then link tensor decompositions to recent representation learning for multi-dimensional data, showing the paramount role of tensors in modern signal processing and machine learning. Finally, we review recent algorithms for tensor decompositions and analyze their common challenge: rank determination.
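For concreteness, below is a minimal sketch of three widely used decomposition formats of the kind the chapter refers to, written for a generic third-order (or N-th-order) tensor. The symbols (the tensor \(\mathcal{X}\), factors \(\mathbf{a}_r, \mathbf{b}_r, \mathbf{c}_r\), core \(\mathcal{G}\), and TT cores \(\mathbf{G}_n\)) are illustrative notation and not necessarily the chapter's own.

% Canonical polyadic (CP) decomposition of X in R^{I x J x K},
% where \circ is the vector outer product and the smallest such R is the CP rank:
\[
\mathcal{X} \;\approx\; \sum_{r=1}^{R} \mathbf{a}_r \circ \mathbf{b}_r \circ \mathbf{c}_r .
\]

% Tucker decomposition with core tensor G and factor matrices A, B, C,
% where \times_n denotes the mode-n product:
\[
\mathcal{X} \;\approx\; \mathcal{G} \times_1 \mathbf{A} \times_2 \mathbf{B} \times_3 \mathbf{C} .
\]

% Tensor-train (TT) decomposition of an N-th-order tensor, written entry-wise;
% each G_n(i_n) is an r_{n-1} x r_n matrix slice of the n-th TT core, with r_0 = r_N = 1:
\[
\mathcal{X}(i_1, i_2, \ldots, i_N) \;\approx\; \mathbf{G}_1(i_1)\,\mathbf{G}_2(i_2)\cdots\mathbf{G}_N(i_N).
\]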

Author information

Corresponding author: Lei Cheng

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Cheng, L., Chen, Z., Wu, YC. (2023). Tensor Decomposition: Basics, Algorithms, and Recent Advances. In: Bayesian Tensor Decomposition for Signal Processing and Machine Learning. Springer, Cham. https://doi.org/10.1007/978-3-031-22438-6_1

  • DOI: https://doi.org/10.1007/978-3-031-22438-6_1

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-22437-9

  • Online ISBN: 978-3-031-22438-6

  • eBook Packages: Engineering, Engineering (R0)
