Abstract
In classifier combining, the information provided by a set of base classifiers is fused. One of the difficulties in this process is how to deal with the variability between classifiers. Although various measures and many combining rules have been suggested in the past, the problem of constructing optimal combiners is still heavily studied.
In this paper, we discuss and illustrate the possibilities of classifier embedding for analysing the variability of base classifiers as well as their combining rules. To this end, a space is constructed in which classifiers are represented as points. Such a low-dimensional space is the Classifier Projection Space (CPS). It is first used to design a visual tool that gives more insight into the differences between various combining techniques, which we illustrate with examples. Finally, we discuss how the CPS may also serve as a basis for constructing new combining rules.
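The abstract does not fix a particular construction, but the idea of embedding classifiers as points in a low-dimensional space can be sketched as follows. The snippet below is a minimal illustration, assuming pairwise disagreement on a held-out validation set as the dissimilarity between classifiers and metric multidimensional scaling (scikit-learn's MDS) as the projection; the data set and the four base classifiers are arbitrary placeholders, not the ones used in the paper.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.manifold import MDS
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical example: embed base classifiers as points in 2-D,
# using their pairwise disagreement on a validation set as dissimilarity.
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

base_classifiers = {
    "logistic": LogisticRegression(max_iter=1000),
    "naive_bayes": GaussianNB(),
    "tree": DecisionTreeClassifier(max_depth=5, random_state=0),
    "1-nn": KNeighborsClassifier(n_neighbors=1),
}

# Label predictions of each base classifier on the validation objects.
preds = np.array([clf.fit(X_tr, y_tr).predict(X_val)
                  for clf in base_classifiers.values()])

# Dissimilarity between two classifiers: fraction of objects they disagree on.
n = len(base_classifiers)
D = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        D[i, j] = np.mean(preds[i] != preds[j])

# Project the classifiers into a 2-D space (a CPS-like map) with metric MDS.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
points = mds.fit_transform(D)

for name, (px, py) in zip(base_classifiers, points):
    print(f"{name:12s} -> ({px:+.3f}, {py:+.3f})")

In such a map, classifiers that make similar errors lie close together, which is the kind of visual diagnostic the CPS is intended to provide; other dissimilarity measures between classifiers could be substituted for the disagreement used here.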
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Pękalska, E., Duin, R.P.W., Skurichina, M. (2002). A Discussion on the Classifier Projection Space for Classifier Combining. In: Roli, F., Kittler, J. (eds) Multiple Classifier Systems. MCS 2002. Lecture Notes in Computer Science, vol 2364. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45428-4_14
DOI: https://doi.org/10.1007/3-540-45428-4_14
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43818-2
Online ISBN: 978-3-540-45428-1
eBook Packages: Springer Book Archive