Abstract
In Chapter 5, I discussed several adaptive algorithms for computing principal and minor eigenvectors of the online correlation matrix A_k ∈ ℜ^{n×n} from a sequence of vectors {x_k ∈ ℜ^n}. I derived these algorithms by applying gradient descent to an objective function. However, it is well known [Baldi and Hornik 95; Chatterjee et al. Mar 98; Haykin 94] that principal component analysis (PCA) algorithms based on gradient descent converge slowly. Furthermore, both analytical and experimental studies show that the convergence of these algorithms depends on an appropriate choice of the gain sequence {η_k}. Moreover, it has been proven [Chatterjee et al. Nov 97; Chatterjee et al. Mar 98; Chauvin 89] that if the gain sequence exceeds an upper bound, the algorithms may diverge or converge to a false solution.
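The gain sensitivity described above can be illustrated with Oja's classical gradient-style rule for estimating a principal eigenvector from a data stream. The following is a minimal sketch, not the book's exact algorithm: the synthetic stream, the decaying gain sequence η_k = 1/(100 + k), and all other parameter choices are assumptions made here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Fixed correlation structure for the synthetic stream: the true
# principal eigenvector of C is e1 = [1, 0, 0, 0, 0].
C = np.diag([5.0, 2.0, 1.0, 0.5, 0.1])
L = np.linalg.cholesky(C)

# Random unit-norm initial estimate of the principal eigenvector.
w = rng.standard_normal(n)
w /= np.linalg.norm(w)

for k in range(1, 5001):
    x = L @ rng.standard_normal(n)  # sample x_k with E[x x^T] = C
    eta = 1.0 / (100.0 + k)         # decaying gain sequence {eta_k} (assumed)
    y = w @ x
    # Oja's rule: a gradient-style update that keeps ||w|| near 1.
    # A gain that is too large here can make the iteration diverge,
    # which is the sensitivity the chapter sets out to address.
    w += eta * y * (x - y * w)

print(abs(w[0]))  # alignment with e1; approaches 1 as the estimate converges
```

Under these assumptions the estimate aligns with the dominant eigenvector, but slowly and only for a suitably small decaying gain, which is the convergence limitation the chapter's accelerated methods are motivated by.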
Copyright information
© 2022 The Author(s), under exclusive license to APress Media, LLC, part of Springer Nature
Cite this chapter
Chatterjee, C. (2022). Accelerated Computation of Eigenvectors. In: Adaptive Machine Learning Algorithms with Python. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-8017-1_6
Print ISBN: 978-1-4842-8016-4
Online ISBN: 978-1-4842-8017-1