Abstract
In this paper we propose a new training algorithm that relies not only on the training samples but also on the output of the hidden layer. Both the connecting weights and the outputs of the hidden layer are adjusted based on the Least Square Backpropagation (LSB) algorithm. A set of 'required' hidden-layer outputs is added to the input sets through a feedback path to accelerate convergence. Numerical simulation results demonstrate that the algorithm outperforms the conventional BP, Quasi-Newton BFGS (an alternative to conjugate gradient methods for fast optimisation) and LSB algorithms in terms of both convergence speed and training error. Moreover, the proposed method avoids the drawback of the LSB algorithm, whose training error cannot be reduced further after three iterations.
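The abstract only summarises the approach, but the core idea can be illustrated with a minimal sketch: alternate a least-squares solve for the output-layer weights with the computation of a set of 'required' hidden-layer outputs, then nudge the hidden layer toward those targets. This is an illustrative assumption of how such a scheme might look, not the authors' implementation; all names, the toy XOR task, and the clipping constants are our own choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: learn XOR with a single hidden layer.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # inputs
T = np.array([[0.], [1.], [1.], [0.]])                  # targets

n_hidden = 4
W1 = rng.normal(scale=0.5, size=(2, n_hidden))
b1 = np.zeros(n_hidden)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit(y, eps=1e-6):
    # Inverse sigmoid, clipped away from 0 and 1.
    y = np.clip(y, eps, 1.0 - eps)
    return np.log(y / (1.0 - y))

lr = 0.2
for _ in range(30):
    H = sigmoid(X @ W1 + b1)                   # current hidden outputs
    Ha = np.hstack([H, np.ones((len(X), 1))])  # append bias column
    # Least-squares solve for the output layer given the current H
    # (targets taken in logit space so the sigmoid can be inverted).
    sol, *_ = np.linalg.lstsq(Ha, logit(T), rcond=None)
    W2, b2 = sol[:-1], sol[-1]
    # 'Required' hidden outputs: least-squares targets for H such that
    # the output layer would reproduce T in logit space.
    H_req, *_ = np.linalg.lstsq(W2.T, (logit(T) - b2).T, rcond=None)
    H_req = np.clip(H_req.T, 0.05, 0.95)
    # Gradient step moving the hidden layer toward the required outputs.
    delta = (H - H_req) * H * (1.0 - H)
    W1 -= lr * X.T @ delta
    b1 -= lr * delta.sum(axis=0)

# Final output-layer solve on the trained hidden representation.
H = sigmoid(X @ W1 + b1)
Ha = np.hstack([H, np.ones((len(X), 1))])
sol, *_ = np.linalg.lstsq(Ha, logit(T), rcond=None)
pred = sigmoid(Ha @ sol)
```

The appeal of this structure, as the abstract suggests, is that the output layer is fitted exactly (in the least-squares sense) at every step, so gradient descent only has to shape the hidden representation.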
Cite this article
Li, Y., Rad, A. & Peng, W. An Enhanced Training Algorithm for Multilayer Neural Networks Based on Reference Output of Hidden Layer. Neural Comput & Applic 8, 218–225 (1999). https://doi.org/10.1007/s005210050024