Abstract
In this paper, we propose an improved learning algorithm, the self-adaptive evolutionary extreme learning machine (SaE-ELM), for single-hidden-layer feedforward networks (SLFNs). In SaE-ELM, the hidden node parameters of the network are optimized by a self-adaptive differential evolution (DE) algorithm, whose trial vector generation strategies and associated control parameters are self-adapted from a strategy pool by learning from their previous success in generating promising solutions, while the network output weights are calculated analytically using the Moore–Penrose generalized inverse. Because it self-adaptively determines the suitable generation strategies and control parameters of DE, SaE-ELM generally outperforms the evolutionary extreme learning machine (E-ELM) and the differential evolution based Levenberg–Marquardt method. Simulations show that SaE-ELM not only performs better than E-ELM with several manually chosen generation strategies and control parameters but also achieves better generalization performance than several related methods.
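To make the division of labor concrete, the sketch below shows the two ingredients the abstract describes: hidden node parameters searched by differential evolution, and output weights solved in closed form with the Moore–Penrose pseudoinverse. This is a minimal illustration, not the paper's algorithm: it uses a plain DE/rand/1/bin loop with fixed control parameters `F` and `CR` (SaE-ELM instead self-adapts the strategies and parameters from a pool), and the toy regression data, network sizes, and function names are all assumptions for the example.

```python
import numpy as np

def fitness(theta, X, T, n_in, n_hidden):
    """Training RMSE of an ELM whose hidden parameters are encoded in theta;
    the output weights beta come from the Moore-Penrose pseudoinverse."""
    W = theta[: n_in * n_hidden].reshape(n_in, n_hidden)  # input weights
    b = theta[n_in * n_hidden :]                          # hidden biases
    H = np.tanh(X @ W + b)                                # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                          # least-squares output weights
    return np.sqrt(np.mean((H @ beta - T) ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (100, 2))                          # toy regression problem
T = (np.sin(3 * X[:, 0]) + X[:, 1] ** 2).reshape(-1, 1)

n_in, n_hidden, pop_size = 2, 15, 20
dim = n_in * n_hidden + n_hidden                          # flattened (W, b) per individual
F, CR = 0.5, 0.9  # fixed here; SaE-ELM self-adapts these during the run
pop = rng.uniform(-1, 1, (pop_size, dim))
cost = np.array([fitness(p, X, T, n_in, n_hidden) for p in pop])

for _ in range(30):  # DE/rand/1/bin generations over the hidden parameters
    for i in range(pop_size):
        idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
        a, b_, c = pop[idx]
        mutant = a + F * (b_ - c)                         # differential mutation
        cross = rng.random(dim) < CR                      # binomial crossover mask
        cross[rng.integers(dim)] = True                   # at least one gene from mutant
        trial = np.where(cross, mutant, pop[i])
        f = fitness(trial, X, T, n_in, n_hidden)
        if f <= cost[i]:                                  # greedy selection
            pop[i], cost[i] = trial, f

print(f"best training RMSE: {cost.min():.4f}")
```

Note that each fitness evaluation is cheap because the output weights never need iterative training; the pseudoinverse solve is what lets the evolutionary search focus only on the hidden node parameters.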
Cao, J., Lin, Z. & Huang, GB. Self-Adaptive Evolutionary Extreme Learning Machine. Neural Process Lett 36, 285–305 (2012). https://doi.org/10.1007/s11063-012-9236-y