
Frequency component Kernel for SVM

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Finding a proper kernel for the support vector machine (SVM) and tuning the associated parameters for better classification remain significant challenges. This paper addresses both challenges in two parts. In part one, a new kernel, called the Frequency Component Kernel (FCK), is presented; in part two, two techniques for forming objective functions are introduced to estimate its shape parameter. In designing the FCK, a new Frequency-Based Regressor Matrix (FBRM) is constructed through data-structure discovery via curve fitting. The inner product of this regressor matrix with itself produces an intermediary kernel, and the FCK is a smoothed version of this intermediary kernel. The FCK’s classification accuracy, with a 95% confidence interval, is compared to well-known kernels, namely the Gaussian, linear, polynomial, and sigmoid kernels, on fifteen data sets. A grid-search method is employed for parameter assignment in all kernels, and the comparison shows the superiority of the FCK in most cases. In part two, the first technique forms an objective function from the variances of the data groups, the distances between the group centers, and an upper bound on the classification error; the second technique uses the distances between all data points, the SVM margin, and the distance between the group centers. Both techniques take advantage of the FCK development: all data are mapped to the new space via the FBRM, and the data distances are then calculated in that space. The comparative results show that both proposed objective-function techniques outperform current state-of-the-art parameter estimation methods. Overall, the combination of the FCK with the two automatic shape-parameter estimation methods makes it a strong choice for many SVM applications.
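The abstract's core recipe can be illustrated with a minimal sketch: map the data through a feature (regressor) matrix, take its inner product with itself to form an intermediary kernel, and pick the shape parameter by maximising a class-separability objective built from group-center distances and within-group variances. Note the caveats: the paper's actual FBRM (derived via curve fitting) and its exact objective functions are not reproduced here; the `frequency_features` map and the `separability` score below are hypothetical stand-ins chosen only to make the pipeline concrete.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def frequency_features(X, n_freqs=4, gamma=1.0):
    """Hypothetical stand-in for the paper's Frequency-Based Regressor
    Matrix (FBRM): sine/cosine components at a few frequencies per
    input dimension, with a shape parameter gamma."""
    cols = []
    for k in range(1, n_freqs + 1):
        cols.append(np.cos(k * gamma * X))
        cols.append(np.sin(k * gamma * X))
    return np.hstack(cols)

def separability(Phi, y):
    """Toy objective in the spirit of part two: squared distance between
    the two class centers divided by the summed within-class variances,
    computed in the mapped (regressor) space."""
    c0, c1 = Phi[y == 0].mean(axis=0), Phi[y == 1].mean(axis=0)
    within = Phi[y == 0].var(axis=0).sum() + Phi[y == 1].var(axis=0).sum()
    return np.linalg.norm(c0 - c1) ** 2 / (within + 1e-12)

# Synthetic nonlinear two-class problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Estimate the shape parameter by maximising the separability objective
# over a small grid (standing in for the paper's automatic techniques).
gammas = [0.25, 0.5, 1.0, 2.0, 4.0]
best_gamma = max(
    gammas,
    key=lambda g: separability(frequency_features(X_tr, gamma=g), y_tr),
)

# The inner product of the regressor matrix with itself gives the
# intermediary kernel; train an SVM on it as a precomputed Gram matrix.
Phi_tr = frequency_features(X_tr, gamma=best_gamma)
Phi_te = frequency_features(X_te, gamma=best_gamma)
K_tr = Phi_tr @ Phi_tr.T          # train-vs-train Gram matrix
K_te = Phi_te @ Phi_tr.T          # test-vs-train Gram matrix
clf = SVC(kernel="precomputed").fit(K_tr, y_tr)
acc = clf.score(K_te, y_te)
print(f"chosen gamma: {best_gamma}, test accuracy: {acc:.2f}")
```

Because the kernel is supplied as a precomputed Gram matrix, any feature construction (including the actual FBRM) can be swapped in without touching the SVM itself; smoothing the intermediary kernel, as the paper does to obtain the FCK, would be an additional step between forming `K` and fitting.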


Figures 1–7 appear in the full article.



Author information


Corresponding author

Correspondence to Ahmad R. Naghsh-Nilchi.

Ethics declarations

Conflict of interest

To the best of our knowledge, there is no conflict of interest concerning this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Esteki, S., Naghsh-Nilchi, A.R. Frequency component Kernel for SVM. Neural Comput & Applic 34, 22449–22464 (2022). https://doi.org/10.1007/s00521-022-07632-4

