Abstract
In machine learning, Ensemble Learning methodologies are known to improve predictive accuracy and robustness. They consist in learning several classifiers whose outputs are then combined according to various techniques. Bagging, or Bootstrap Aggregating, is one of the best-known Ensemble methodologies and is usually applied to a single base classification algorithm, i.e. the same type of classifier is learnt multiple times on bootstrapped versions of the initial learning dataset. In this paper, we propose a bagging methodology that involves different types of classifiers. The classifiers' probabilistic outputs are used to build mass functions, which are then combined within the belief functions framework. Three ways of building mass functions are proposed, and preliminary experiments on benchmark datasets showing the relevance of the approach are presented.
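To make the pipeline concrete, the sketch below illustrates one plausible reading of such an evidential bagging scheme in Python: heterogeneous base classifiers are each trained on their own bootstrap sample, their probabilistic outputs are discounted into mass functions, the masses are combined with Dempster's rule, and a decision is taken via the pignistic probability. This is a minimal sketch under stated assumptions, not the authors' exact construction: the reliability-discounting step (using out-of-bag accuracy as the discount rate) is one illustrative choice standing in for the paper's three mass-building schemes, and all helper names are hypothetical.

```python
import numpy as np
from functools import reduce
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

def probs_to_mass(p, alpha):
    # Discount a probability vector into a mass function:
    # mass alpha * p_k on each singleton {k}, mass 1 - alpha on the
    # whole frame Omega (stored in the last slot).
    return np.append(alpha * p, 1.0 - alpha)

def dempster(m1, m2):
    # Dempster's rule of combination, specialised to mass functions
    # whose focal sets are the singletons and Omega (last slot).
    k = len(m1) - 1
    out = np.empty_like(m1)
    out[:k] = m1[:k] * m2[:k] + m1[:k] * m2[k] + m1[k] * m2[:k]
    out[k] = m1[k] * m2[k]
    return out / out.sum()  # renormalise: 1 - out.sum() is the conflict

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
K = len(np.unique(y))  # size of the frame of discernment

# Heterogeneous base classifiers, each fitted on its own bootstrap
# sample; out-of-bag accuracy serves as an (assumed) reliability
# estimate used as the discount rate.
members = []
for clf in (DecisionTreeClassifier(random_state=0), GaussianNB(),
            KNeighborsClassifier()):
    idx = rng.integers(0, len(X_tr), len(X_tr))    # bootstrap indices
    oob = np.setdiff1d(np.arange(len(X_tr)), idx)  # out-of-bag rows
    clf.fit(X_tr[idx], y_tr[idx])                  # assumes all classes drawn
    members.append((clf, clf.score(X_tr[oob], y_tr[oob])))

# Combine the discounted masses with Dempster's rule and decide
# through the pignistic transform BetP({k}) = m({k}) + m(Omega)/K.
preds = []
for x in X_te:
    masses = [probs_to_mass(clf.predict_proba(x.reshape(1, -1))[0], a)
              for clf, a in members]
    m = reduce(dempster, masses)
    preds.append(np.argmax(m[:K] + m[K] / K))
print("evidential-bagging accuracy:", np.mean(np.array(preds) == y_te))
```

Variants of this scheme would swap Dempster's rule for another combination rule or derive the discount rates differently; the paper's three mass-construction methods would slot in at the `probs_to_mass` step.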
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this paper
Sutton-Charani, N., Imoussaten, A., Harispe, S., Montmain, J. (2018). Evidential Bagging: Combining Heterogeneous Classifiers in the Belief Functions Framework. In: Medina, J., et al. Information Processing and Management of Uncertainty in Knowledge-Based Systems. Theory and Foundations. IPMU 2018. Communications in Computer and Information Science, vol 853. Springer, Cham. https://doi.org/10.1007/978-3-319-91473-2_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-91472-5
Online ISBN: 978-3-319-91473-2