Abstract
In this chapter, we first introduce preliminaries on tensors, including terminology, the associated notations, related multilinear algebra, and, most importantly, widely used tensor decomposition formats. We then link tensor decompositions to recent representation learning for multi-dimensional data, showing the paramount role of tensors in modern signal processing and machine learning. Finally, we review recent algorithms for tensor decompositions and analyze their common challenge: rank determination.
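To make the abstract concrete, the sketch below illustrates one of the decomposition formats the chapter covers, the CP (canonical polyadic) decomposition, fitted by plain alternating least squares on a small synthetic tensor. This is a minimal illustration under assumed conventions (Kolda-style unfoldings with the last index varying fastest), not the chapter's own algorithm; note that the rank `R` must be supplied up front, which is precisely the rank-determination difficulty the abstract highlights.

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 6, 5, 4, 3  # tensor dimensions and an assumed-known CP rank

# Build a synthetic rank-R tensor from known ground-truth factor matrices.
A_true = rng.standard_normal((I, R))
B_true = rng.standard_normal((J, R))
C_true = rng.standard_normal((K, R))
X = np.einsum('ir,jr,kr->ijk', A_true, B_true, C_true)

def unfold(T, mode):
    """Mode-n unfolding: mode-`mode` fibers become the rows' entries,
    with the trailing index varying fastest along columns (C order)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(U, V):
    """Column-wise Khatri-Rao product: column r is kron(U[:, r], V[:, r])."""
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

# ALS: cyclically update each factor with the other two held fixed,
# solving the linear least-squares problem X_(n) = U_n * M_n^T for U_n.
A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
for _ in range(500):
    A = unfold(X, 0) @ np.linalg.pinv(khatri_rao(B, C)).T
    B = unfold(X, 1) @ np.linalg.pinv(khatri_rao(A, C)).T
    C = unfold(X, 2) @ np.linalg.pinv(khatri_rao(A, B)).T

X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
```

Because the synthetic tensor is exactly rank `R`, the relative reconstruction error typically drops to near machine precision; with a wrong rank guess, ALS would either underfit or produce non-unique, overfitted factors.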
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Cheng, L., Chen, Z., Wu, YC. (2023). Tensor Decomposition: Basics, Algorithms, and Recent Advances. In: Bayesian Tensor Decomposition for Signal Processing and Machine Learning. Springer, Cham. https://doi.org/10.1007/978-3-031-22438-6_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-22437-9
Online ISBN: 978-3-031-22438-6