On the Choice of Regression Basis Functions and Machine Learning

  • MATHEMATICS
  • Published in Vestnik St. Petersburg University, Mathematics

Abstract

As is known, regression-analysis tools are widely used in machine-learning problems to establish relationships between observed variables and to store information in a compact form. Most often, the regression function is described by a linear combination of given functions fj(X), j = 1, …, m, X ∈ D ⊂ R^s. If the observed data contain a random error, then the regression function reconstructed from the observations contains both a random error and a systematic error that depends on the selected functions fj. This article shows that the fj can be chosen optimally, in the sense of a given functional metric, when the true dependence is known to obey some functional equation. In some cases (a regular grid, s ≤ 2), close results can be obtained using techniques of random-process analysis. The numerical examples given in this work illustrate the significantly broader opportunities offered by the proposed approach to regression problems.
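
The model described above — a linear combination of basis functions fj fitted by least squares — can be illustrated with a minimal sketch. The example function, bases, and noise level below are assumptions for illustration only, not taken from the paper; the point is that a basis adapted to the functional equation the true dependence satisfies removes the systematic error, leaving only the random one.

```python
import numpy as np

# Hypothetical setup: the true dependence is y = sin(2*pi*x) on [0, 1],
# observed with additive Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y_true = np.sin(2 * np.pi * x)
y = y_true + rng.normal(scale=0.05, size=x.size)

def fit(basis, x, y):
    """Least-squares fit of a linear combination of basis functions f_j."""
    A = np.column_stack([f(x) for f in basis])    # design matrix A[i, j] = f_j(x_i)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # minimize ||A c - y||_2
    return A @ coef

# A generic cubic-polynomial basis vs. a basis adapted to the true model
# (chosen because the true dependence satisfies y'' + (2*pi)^2 y = 0).
poly_basis = [lambda t, k=k: t**k for k in range(4)]
trig_basis = [lambda t: np.sin(2 * np.pi * t), lambda t: np.cos(2 * np.pi * t)]

# Sup-norm reconstruction errors against the true dependence:
err_poly = np.max(np.abs(fit(poly_basis, x, y) - y_true))  # random + systematic
err_trig = np.max(np.abs(fit(trig_basis, x, y) - y_true))  # essentially random only
```

With the adapted trigonometric basis the residual error is on the order of the noise, while the polynomial basis adds a systematic truncation error on top of it.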


Figures 1–4 appear in the full article.



Author information

Correspondence to S. M. Ermakov or S. N. Leora.

Ethics declarations

CONFLICT OF INTEREST

We declare that we have no conflicts of interest.

ADDITIONAL INFORMATION

Original Russian text: Ermakov, S. M. and Leora, S. N., On the choice of regression basis functions and machine learning, Vestn. S.-Peterb. Univ., Ser. 1: Mat., Mekh., Astron., 2022, vol. 9 (67), no. 1, pp. 11–22. https://doi.org/10.21638/spbu01.2022.102

Translated by O. Pismenov

About this article

Cite this article

Ermakov, S.M., Leora, S.N. On the Choice of Regression Basis Functions and Machine Learning. Vestnik St.Petersb. Univ.Math. 55, 7–15 (2022). https://doi.org/10.1134/S1063454122010034

