On the universal approximation property of radial basis function neural networks

  • Research
  • Published in: Annals of Mathematics and Artificial Intelligence

Abstract

In this paper we consider a new class of RBF (Radial Basis Function) neural networks, in which smoothing factors are replaced with shifts. We prove under certain conditions on the activation function that these networks are capable of approximating any continuous multivariate function on any compact subset of the d-dimensional Euclidean space. For RBF networks with finitely many fixed centroids we describe conditions guaranteeing approximation with arbitrary precision.
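The sketch below is only an illustration of the kind of architecture described above: reading the abstract, each unit is taken to apply the activation to a shifted distance, N(x) = sum_i w_i * phi(||x - a_i|| - s_i), rather than to a distance scaled by a smoothing factor. This reading, together with every name, the choice phi = tanh, and the target function in the code, is an assumption for demonstration purposes and not the paper's construction.

```python
import numpy as np

# Minimal sketch, not the paper's construction: an RBF-type network whose units
# apply the activation to a *shifted* distance, phi(||x - a_i|| - s_i), instead
# of a scaled distance phi(||x - a_i|| / eps_i).  All names, the choice of
# phi = tanh, and the target function below are illustrative assumptions.

def rbf_shift_net(X, centroids, shifts, weights, phi=np.tanh):
    """Evaluate N(x) = sum_i w_i * phi(||x - a_i|| - s_i) for each row x of X."""
    # Pairwise Euclidean distances, shape (num_points, num_units).
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return phi(dists - shifts) @ weights

# Fit only the outer weights by least squares, with centroids and shifts fixed,
# to approximate a continuous target on the compact set [0, 1]^2.
rng = np.random.default_rng(0)
d, n_units = 2, 60
X = rng.uniform(0.0, 1.0, size=(500, d))                   # sample points
y = np.sin(2 * np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])  # target values

centroids = rng.uniform(0.0, 1.0, size=(n_units, d))
shifts = rng.uniform(-1.0, 1.0, size=n_units)

design = np.tanh(
    np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) - shifts
)
weights, *_ = np.linalg.lstsq(design, y, rcond=None)

approx = rbf_shift_net(X, centroids, shifts, weights)
print("max abs error on the sample points:", float(np.max(np.abs(approx - y))))
```

In this toy setup, increasing the number of units and resampling the centroids and shifts typically drives the error down, consistent with the density-type statement in the abstract.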

Data Availability Statement

Data sharing is not applicable to this article, as no datasets were generated or analyzed during the current study.

Acknowledgements

The authors would like to thank the anonymous reviewers for their insightful comments and suggestions, which helped to improve the quality of the paper.

Funding

No funding was received for this study.

Author information

Contributions

Aysu Ismayilova wrote Section 2. Muhammad Ismayilov wrote Section 3. Both authors wrote Section 1 and reviewed the manuscript.

Corresponding author

Correspondence to Muhammad Ismayilov.

Ethics declarations

Conflict of interest

The authors declare that they have no conflicts of interest.

Compliance with ethical standards

This article does not contain any studies involving human participants or animals performed by any of the authors.

Competing interests

We declare that we have no competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Ismayilova, A., Ismayilov, M. On the universal approximation property of radial basis function neural networks. Ann Math Artif Intell 92, 691–701 (2024). https://doi.org/10.1007/s10472-023-09901-x

  • Accepted:

  • Published:

  • Issue Date:

  • DOI: https://doi.org/10.1007/s10472-023-09901-x

Keywords

Mathematics Subject Classification (2010)
