Abstract
This paper addresses ensemble generation for the evidential k-nearest-neighbour (EkNN) classifier. An efficient method based on particle swarm optimization (PSO) is proposed: the performance of the EkNN classifier is improved by a random-subspace ensembling method. Given a set of random-subspace EkNN classifiers, PSO is used to find the best parameters of each classifier, and the resulting classifiers are combined by the “vote rule”. The performance improvement over state-of-the-art approaches is validated through experiments on several benchmark datasets.
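The overall pipeline (train several k-NN classifiers on random feature subspaces, then combine them by majority vote) can be sketched as follows. This is an illustrative sketch only: it substitutes a plain distance-weighted majority k-NN for the evidential rule, omits the PSO parameter-tuning step, and all function names and parameters (`knn_predict`, `subspace_frac`, etc.) are assumptions, not the authors' implementation.

```python
import numpy as np

def knn_predict(Xtr, ytr, Xte, k=5):
    """Plain k-NN majority vote (a stand-in for the evidential k-NN rule)."""
    preds = []
    for x in Xte:
        d = np.linalg.norm(Xtr - x, axis=1)      # Euclidean distances to training set
        nn_labels = ytr[np.argsort(d)[:k]]       # labels of the k nearest neighbours
        preds.append(np.bincount(nn_labels).argmax())
    return np.array(preds)

def random_subspace_ensemble(Xtr, ytr, Xte, n_classifiers=10,
                             subspace_frac=0.5, k=5, seed=0):
    """Random-subspace ensemble combined by the majority-vote rule."""
    rng = np.random.default_rng(seed)
    n_features = Xtr.shape[1]
    m = max(1, int(subspace_frac * n_features))  # features per subspace
    votes = []
    for _ in range(n_classifiers):
        # each member sees a random subset of the features
        feats = rng.choice(n_features, size=m, replace=False)
        votes.append(knn_predict(Xtr[:, feats], ytr, Xte[:, feats], k=k))
    votes = np.stack(votes)                      # shape: (n_classifiers, n_test)
    # "vote rule": majority vote across ensemble members, per test sample
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

In the paper, each member's parameters (e.g. the EkNN tuning parameters) would additionally be optimized by PSO before the votes are combined.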
Notes
parameter of the EkNN classifier.
MATLAB implementations of these algorithms, provided by their authors, can be downloaded from http://www.hds.utc.fr/tdenoeux/software.htm.
The PSO is implemented as in the PSO MATLAB Toolbox, available at http://psotoolbox.sourceforge.net.
Cite this article
Nanni, L., Lumini, A. Particle swarm optimization for ensembling generation for evidential k-nearest-neighbour classifier. Neural Comput & Applic 18, 105–108 (2009). https://doi.org/10.1007/s00521-007-0162-2