Abstract
Finding a proper kernel for a support vector machine (SVM) and tuning the involved parameters for better classification remain significant challenges. This paper addresses both challenges in two parts. In the first part, a new kernel, called the Frequency Component Kernel (FCK), is presented; in the second part, two techniques for forming objective functions are introduced to estimate its shape parameter. In designing the FCK, a new Frequency-Based Regressor Matrix (FBRM) is constructed by discovering the data structure through curve fitting. The inner product of this regressor matrix with itself produces an intermediary kernel; the FCK is a smoothed version of this intermediary kernel. The FCK's classification accuracy, with a 95% confidence interval, is compared to well-known kernels, namely the Gaussian, linear, polynomial, and sigmoid kernels, on fifteen datasets. A grid search is employed for parameter assignment in all kernels. This comparison shows the superiority of the FCK in most cases. In the second part, the first technique for forming an objective function is based on the variances of data groups, the distances between the centers of data groups, and upper-bound classification errors; the second technique is based on the distances between all data points, the SVM margin, and the distance between the centers of data groups. Both techniques take advantage of the FCK development: all data are mapped into the new space via the FBRM, and the data distances are then calculated in this space. The comparative results show that both suggested techniques outperform current state-of-the-art parameter-estimation methods. Overall, the results show that the combination of the FCK with the two automatic shape-parameter estimation methods is a strong choice for many SVM applications.
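The abstract describes two reusable ideas: building a kernel as the inner product of a regressor (feature) matrix with itself, and selecting the shape parameter by grid search. The paper's exact FBRM construction (curve fitting and the smoothing step) is not given in the abstract, so the sketch below substitutes a simple Fourier-style cosine/sine feature map as an illustrative stand-in; `frequency_features`, `n_freq`, and the shape parameter `gamma` are assumptions, not the authors' method.

```python
# Illustrative sketch only: the paper's FBRM is built via curve fitting and the
# FCK adds a smoothing step; both are omitted here. A Fourier-style feature map
# stands in for the regressor matrix, and the kernel is its inner product with
# itself (the "intermediary kernel" of the abstract).
from functools import partial

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def frequency_features(X, n_freq=5, gamma=1.0):
    """Map samples to frequency components (hypothetical stand-in for the FBRM)."""
    ks = np.arange(1, n_freq + 1)
    # Broadcast: (n, d, 1) * (1, 1, n_freq) -> (n, d, n_freq)
    Z = gamma * X[:, :, None] * ks[None, None, :]
    Phi = np.concatenate([np.cos(Z), np.sin(Z)], axis=2)
    return Phi.reshape(len(X), -1)

def frequency_kernel(X, Y, n_freq=5, gamma=1.0):
    """Intermediary kernel: inner product of the regressor matrix with itself."""
    return frequency_features(X, n_freq, gamma) @ frequency_features(Y, n_freq, gamma).T

# Grid search over the shape parameter, scored by cross-validated accuracy,
# mirroring the grid-search parameter assignment mentioned in the abstract.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

best = None
for gamma in [0.1, 0.5, 1.0, 2.0]:
    clf = SVC(kernel=partial(frequency_kernel, gamma=gamma), C=1.0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    if best is None or score > best[0]:
        best = (score, gamma)

print(f"best gamma = {best[1]}, CV accuracy = {best[0]:.3f}")
```

Passing a callable as `kernel=` lets scikit-learn's `SVC` compute the Gram matrix from this custom feature map directly; the two objective-function techniques the paper proposes would replace the plain cross-validation score used here.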
Ethics declarations
Conflict of interest
The authors declare that, to the best of their knowledge, there is no conflict of interest regarding this article.
About this article
Cite this article
Esteki, S., Naghsh-Nilchi, A.R. Frequency component Kernel for SVM. Neural Comput & Applic 34, 22449–22464 (2022). https://doi.org/10.1007/s00521-022-07632-4