Feature Selection Based on Kernel Discriminant Analysis

Conference paper
Artificial Neural Networks – ICANN 2006 (ICANN 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4132)

Abstract

For two-class problems we propose two feature selection criteria based on kernel discriminant analysis (KDA). The first is the objective function of KDA and the second is the KDA-based exception ratio. We show that the objective function of KDA is monotonic with respect to the deletion of features, which ensures stable feature selection. The KDA-based exception ratio measures the overlap between classes in the one-dimensional space obtained by KDA. Computer experiments show that both criteria select features well, but the former is more stable.
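The selection scheme described above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes an RBF kernel and the standard two-class kernel Fisher discriminant formulation, in which the objective reduces to (m₁ − m₀)ᵀ N⁻¹ (m₁ − m₀) for kernel-space class means m₀, m₁ and a regularized within-class scatter N. Backward elimination then deletes, at each step, the feature whose removal degrades the objective least; the function names and parameters here are illustrative.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared distances, mapped through the RBF kernel
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(d2, 0.0, None))

def kda_objective(K, y, reg=1e-6):
    """Kernel Fisher discriminant ratio for a two-class problem.

    Returns (m1 - m0)^T N^{-1} (m1 - m0), where m0, m1 are the
    kernel-space class means and N is the regularized within-class
    scatter, all expressed in the kernel basis."""
    n = len(y)
    N = reg * np.eye(n)
    means = []
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        Kc = K[:, idx]
        l = len(idx)
        means.append(Kc.mean(axis=1))
        # Within-class scatter contribution: Kc (I - 11^T / l) Kc^T
        N += Kc @ (np.eye(l) - np.ones((l, l)) / l) @ Kc.T
    d = means[1] - means[0]
    return float(d @ np.linalg.solve(N, d))

def backward_select(X, y, n_keep, gamma=1.0):
    """Backward feature elimination guided by the KDA objective:
    at each step, delete the feature whose removal leaves the
    objective highest (i.e. hurts it least)."""
    feats = list(range(X.shape[1]))
    score = lambda fs: kda_objective(rbf_kernel(X[:, fs], gamma), y)
    while len(feats) > n_keep:
        victim = max(feats, key=lambda f: score([g for g in feats if g != f]))
        feats.remove(victim)
    return feats
```

On toy data where one feature separates the classes and the rest are noise, this procedure retains the discriminative feature; the monotonicity result in the paper is what makes such greedy deletion well behaved.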




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ashihara, M., Abe, S. (2006). Feature Selection Based on Kernel Discriminant Analysis. In: Kollias, S., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4132. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840930_29


  • DOI: https://doi.org/10.1007/11840930_29

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-38871-5

  • Online ISBN: 978-3-540-38873-9

  • eBook Packages: Computer Science (R0)
