Abstract
In this paper, a stochastic configuration broad learning system (SCBLS) is proposed for data modeling. The proposed SCBLS is built as a flat network whose architecture is determined by a constructive learning approach. The input parameters of the feature nodes and enhancement nodes of SCBLS are randomly assigned under a supervisory mechanism: inequality constraints are used both to assign the random hidden parameters and to adaptively select the scopes from which they are drawn. The output parameters of SCBLS are determined either in a constructive manner or by solving a global least squares problem. It is proved that the proposed SCBLS possesses the universal approximation property. The performance of the proposed SCBLS is evaluated on function approximation, benchmark datasets and time series prediction. Numerical examples show that SCBLS achieves satisfactory approximation accuracy.
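The constructive scheme summarized in the abstract can be sketched in the spirit of stochastic configuration networks: candidate hidden parameters are drawn at random over an adaptively widened scope, accepted only when a supervisory inequality on the current residual holds, and the output weights are then refit by global least squares. The function `scbls_sketch`, the scale grid, and the contraction factor `r` below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def scbls_sketch(X, y, max_nodes=50, candidates=20, scales=(1, 5, 10), r=0.99, tol=1e-3):
    """Constructive random-node regression sketch in the spirit of SCBLS/SCN."""
    n = X.shape[0]
    H = np.ones((n, 1))                       # start from a bias column
    beta = np.linalg.lstsq(H, y, rcond=None)[0]
    e = y - H @ beta                          # current residual
    for _ in range(max_nodes):
        best_g, best_xi = None, -np.inf
        for s in scales:                      # adaptively widen the random scope
            for _ in range(candidates):
                w = np.random.uniform(-s, s, X.shape[1])
                b = np.random.uniform(-s, s)
                g = np.tanh(X @ w + b)        # candidate node output
                # supervisory inequality: xi = <e,g>^2/<g,g> - (1-r)<e,e> > 0
                xi = (e @ g) ** 2 / (g @ g) - (1 - r) * (e @ e)
                if xi > best_xi:
                    best_xi, best_g = xi, g
        if best_xi <= 0:                      # no admissible candidate found
            break
        H = np.column_stack([H, best_g])      # accept the best candidate node
        beta = np.linalg.lstsq(H, y, rcond=None)[0]  # global least squares refit
        e = y - H @ beta
        if e @ e / n < tol:                   # residual small enough, stop growing
            break
    return H, beta
```

Because the output weights are refit by least squares after every accepted node, the training residual is non-increasing as the flat network grows, which is the mechanism behind the universal approximation analysis.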
Availability of data and material
All data used during this study are available from the corresponding author on reasonable request.
Code availability
The code used during the current study is available from the corresponding author on reasonable request.
Notes
KEEL: http://www.keel.es/
TSDL: https://datamarket.com/data/list/?q=provider:tsdl.
Acknowledgements
This work was supported by the National Natural Science Foundation (NNSF) of China under Grants 61773088 and 12071056, the Fundamental Research Funds for the Central Universities (DUT20JC30), and the National Key R&D Program of China (2018AAA0100300).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Cite this article
Zhou, W., Wang, D., Li, H. et al. Stochastic configuration broad learning system and its approximation capability analysis. Int. J. Mach. Learn. & Cyber. 13, 797–810 (2022). https://doi.org/10.1007/s13042-021-01341-5