
Particle swarm optimization for ensembling generation for evidential k-nearest-neighbour classifier

  • Original Article
  • Published in Neural Computing and Applications

Abstract

This paper addresses ensemble generation for the evidential k-nearest-neighbour (EkNN) classifier. An efficient method based on particle swarm optimization (PSO) is proposed: the performance of the EkNN classifier is improved through a random-subspace-based ensemble method. Given a set of random-subspace EkNN classifiers, PSO is used to obtain the best parameters of each classifier, and the resulting classifiers are combined by the "vote rule". The performance improvement over state-of-the-art approaches is validated through experiments on several benchmark datasets.
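The pipeline the abstract describes (random feature subspaces, one nearest-neighbour member per subspace, fusion by the majority "vote rule") can be sketched as follows. This is a minimal illustration, not the authors' implementation: a plain Euclidean k-NN stands in for the evidential EkNN rule, the PSO parameter search is omitted, and all function names are hypothetical.

```python
import random
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Plain k-NN majority vote; a stand-in for the evidential EkNN rule."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(row, x)), label)
        for row, label in zip(X_train, y_train)
    )
    return Counter(label for _, label in dists[:k]).most_common(1)[0][0]

def random_subspaces(n_features, n_classifiers=5, subspace_size=2, seed=0):
    """Draw one random feature subset per ensemble member (random subspace method)."""
    rng = random.Random(seed)
    return [rng.sample(range(n_features), subspace_size)
            for _ in range(n_classifiers)]

def ensemble_predict(X_train, y_train, subspaces, x, k=3):
    """Each member classifies in its own subspace; outputs are fused by the vote rule."""
    votes = []
    for feats in subspaces:
        Xs = [[row[i] for i in feats] for row in X_train]
        xs = [x[i] for i in feats]
        votes.append(knn_predict(Xs, y_train, xs, k))
    return Counter(votes).most_common(1)[0][0]
```

In the paper itself, each member is an EkNN classifier and PSO tunes its parameters before the vote; only the subspace-and-vote skeleton is shown here.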


Fig. 1


Notes

  1. parameter of the EkNN classifier.

  2. These algorithms are implemented in MATLAB by their authors; the code can be downloaded from http://www.hds.utc.fr/tdenoeux/software.htm.

  3. The PSO is implemented as in the PSO MATLAB Toolbox, available at http://psotoolbox.sourceforge.net.

  4. http://www.ics.uci.edu/mlearn/MLRepository.html.
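For context on the notes above, the evidential k-NN rule of Denoeux has each of the k neighbours contribute a simple mass function — mass α·exp(−γ·d²) on the neighbour's class and the remainder on the whole frame Ω — and fuses these with Dempster's rule. The following is a minimal sketch under those assumptions; the function name and the default α and γ values are illustrative, not taken from the paper.

```python
import math

def eknn_masses(neighbours, classes, alpha=0.95, gamma=1.0):
    """Fuse neighbour evidence with Dempster's rule on a frame of
    singletons plus Omega. `neighbours` is a list of (distance, label)."""
    combined = {c: 0.0 for c in classes}   # fused mass on each singleton
    omega = 1.0                            # fused mass on the whole frame
    for d, label in neighbours:
        s = alpha * math.exp(-gamma * d * d)
        m = {c: (s if c == label else 0.0) for c in classes}
        m_omega = 1.0 - s
        # unnormalised Dempster combination (singletons + Omega only)
        combined = {c: combined[c] * (m[c] + m_omega) + omega * m[c]
                    for c in classes}
        omega *= m_omega
    total = sum(combined.values()) + omega  # 1 - conflict
    return {c: v / total for c, v in combined.items()}, omega / total
```

Classification then picks the class with the largest fused mass; a smaller distance d commits more mass to the neighbour's class, while a distant neighbour leaves most of its mass on Ω.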



Author information

Correspondence to Loris Nanni.


About this article

Cite this article

Nanni, L., Lumini, A. Particle swarm optimization for ensembling generation for evidential k-nearest-neighbour classifier. Neural Comput & Applic 18, 105–108 (2009). https://doi.org/10.1007/s00521-007-0162-2

