Inheritance vs. Expansion: Generalization Degree of Nearest Neighbor Rule in Continuous Space as Covering Operator of XCS

  • Conference paper
Applications of Evolutionary Computation (EvoApplications 2022)

Abstract

This paper focuses on the covering mechanism of the XCS Classifier System (XCS), a rule-based machine learning system, which generates a new if-then rule whenever the input does not match any existing rule, and discusses how that rule should be generated from the viewpoint of “inheritance” versus “expansion” of the generalization degree of the nearest neighbor rule in continuous space. For this purpose, this paper proposes two covering mechanisms, based respectively on “inheritance” and “expansion” of the generalization degree of the nearest neighbor rule, and compares them by applying them to XCS for real-valued input spaces (XCSR). Intensive experiments on three types of problems with different characteristics reveal the following implications: (1) in continuous space, new rules should be generated by inheriting the generalization degree of the nearest neighbor rule rather than by expanding it; and (2) XCSR with the “inheritance”-based covering mechanism achieves higher classification accuracy with fewer rules than conventional XCSR, which in turn achieves higher classification accuracy than XCSR with the “expansion”-based covering mechanism.
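
The paper's covering operators themselves are given as Algorithm 1 (note 2 below) and are not reproduced on this page. As a rough illustration of the contrast described in the abstract, the following Python sketch assumes an interval-based (lower/upper bound) condition representation for XCSR and a simple distance measure for picking the nearest neighbor rule; the names Rule, covering_inherit, and covering_expand are illustrative, not the paper's.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Rule:
        lower: List[float]   # per-dimension lower bounds of the condition
        upper: List[float]   # per-dimension upper bounds of the condition
        action: int

    def distance(rule: Rule, x: List[float]) -> float:
        # Distance from input x to the rule's condition (0 if the rule matches x).
        return sum(max(lo - xi, 0.0, xi - hi)
                   for lo, hi, xi in zip(rule.lower, rule.upper, x))

    def covering_inherit(pop: List[Rule], x: List[float], action: int) -> Rule:
        # "Inheritance": centre the new rule on the input and copy the
        # per-dimension interval widths (generalization degree) of the
        # nearest neighbor rule.  Assumes pop is non-empty.
        nearest = min(pop, key=lambda r: distance(r, x))
        widths = [hi - lo for lo, hi in zip(nearest.lower, nearest.upper)]
        return Rule([xi - w / 2 for xi, w in zip(x, widths)],
                    [xi + w / 2 for xi, w in zip(x, widths)], action)

    def covering_expand(pop: List[Rule], x: List[float], action: int) -> Rule:
        # "Expansion": enlarge the nearest neighbor rule's intervals just enough
        # to also cover the input, so its generalization degree can only grow.
        nearest = min(pop, key=lambda r: distance(r, x))
        return Rule([min(lo, xi) for lo, xi in zip(nearest.lower, x)],
                    [max(hi, xi) for hi, xi in zip(nearest.upper, x)], action)

Conventional XCSR covering, by contrast, draws each interval spread at random (presumably bounded by the r_0 parameter listed in notes 5–7), independently of the existing population; the two sketches above differ only in whether the nearest rule's generality is copied or enlarged.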

Notes

  1.

    The experience exp, the time stamp ts, and the action set size as.

  2.

    To implement XCSR-LCGE and XCSR-LCGI, replace the function LocalCovering of XCS-LCPCI in [16] with Algorithm 1.

  3.

    For example, in the case of the bit sequence \(\boldsymbol{b}=11101010101\), \(d=(b_0b_1b_2)_{2}=(111)_{2}=7\), so the correct answer class is determined to be \(b_{k+d}=b_{3+7}=b_{10}=1\). A small code sketch of this decoding is given after these notes.

  4.

    https://www.kaggle.com/torikul140129/paddy-leaf-images-aman (accessed 02.08.2022).

  5.

    11-RMUX: \(N=20,000\), \(\alpha =0.1\), \(\beta =0.229242\), \(\delta =0.1\), \(\nu =5\), \(\theta _{mna}=2\), \(\theta _{GA}=12\), \(\theta _{del}=20\), \(\theta _{sub}=15\), \(\epsilon _0=109.918427\), \(\chi =0.8\), \(\mu =0.04\), \(p_I=0.01\), \(\epsilon _I=0.01\), \(F_I=0.01\), \(FitnessReduction=0.1\), \(m_0=0.1\), \(r_0=1.0\), \(doASSubsumption=yes\), \(doGASubsumption=yes\).

  6.

    6-IRMUX: Analogous to 11-RMUX, except: \(N=2000\), \(\beta =0.030691\), \(\theta _{GA}=192\), \(\theta _{sub}=217\), \(\epsilon _0=9.085638\), \(doASSubsumption=no\).

  7.

    Paddy Leaf: Analogous to 11-RMUX, except: \(N=6400\), \(\beta =0.2\), \(\theta _{mna}=4\), \(\theta _{GA}=48\), \(\theta _{del}=50\), \(\theta _{sub}=50\), \(\epsilon _0=1.0\), \(m_0=0.5\).

  8.

    For example, in the case of the 11-RMUX problem, the generality of each rule in the optimal ruleset [O] [4] is uniformly \(0.5^{k+1}=0.5^4=0.0625\), regardless of the class. Similarly, in the case of 6-IRMUX, it is uniformly \(0.5^{k+1}=0.5^3=0.125\).
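
As a companion to notes 3 and 8, the following Python sketch shows how the correct class of a (real-valued) multiplexer instance is decoded and how the generality of an optimal rule is computed. The 0.5 threshold for binarizing real-valued inputs and the function names are assumptions for illustration, not taken from the paper.

    def multiplexer_class(bits, k=3):
        # The first k bits form an address d; the class is the selected data bit b_{k+d}.
        d = int("".join(str(b) for b in bits[:k]), 2)  # address bits as an integer
        return bits[k + d]

    def rmux_class(xs, k=3, threshold=0.5):
        # Real-valued variant: binarize each input before decoding (threshold assumed).
        return multiplexer_class([1 if x >= threshold else 0 for x in xs], k)

    # Example from note 3: b = 11101010101, k = 3, d = (111)_2 = 7, class = b_10 = 1.
    assert multiplexer_class([1, 1, 1, 0, 1, 0, 1, 0, 1, 0, 1], k=3) == 1

    # Note 8: an optimal rule specifies only the k address bits and the one data bit
    # they select, so it covers a fraction 0.5 ** (k + 1) of the input space,
    # e.g. 0.5 ** 4 = 0.0625 for 11-RMUX (k = 3) and 0.5 ** 3 = 0.125 for 6-IRMUX (k = 2).
    assert abs(0.5 ** 4 - 0.0625) < 1e-12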

References

  1. Barry, A.M.: The stability of long action chains in XCS. Soft. Comput. 6(3–4), 183–199 (2002)

  2. Behdad, M., French, T., Barone, L., Bennamoun, M.: On principal component analysis for high-dimensional XCSR. Evol. Intel. 5(2), 129–138 (2012)

  3. Bernadó-Mansilla, E., Garrell-Guiu, J.M.: Accuracy-based learning classifier systems: models, analysis and applications to classification tasks. Evol. Comput. 11(3), 209–238 (2003)

  4. Butz, M.V., Kovacs, T., Lanzi, P.L., Wilson, S.W.: How XCS evolves accurate classifiers. In: Proceedings of the Third Genetic and Evolutionary Computation Conference (GECCO-2001), pp. 927–934. Citeseer (2001)

  5. Butz, M.V., Sastry, K., Goldberg, D.E.: Tournament selection: stable fitness pressure in XCS. In: Cantú-Paz, E., et al. (eds.) GECCO 2003. LNCS, vol. 2724, pp. 1857–1869. Springer, Heidelberg (2003). https://doi.org/10.1007/3-540-45110-2_83

  6. Butz, M.V., Wilson, S.W.: An algorithmic description of XCS. Soft. Comput. 6(3–4), 144–153 (2002)

  7. Fredivianus, N., Prothmann, H., Schmeck, H.: XCS revisited: a novel discovery component for the eXtended classifier system. In: Deb, K., et al. (eds.) SEAL 2010. LNCS, vol. 6457, pp. 289–298. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-17298-4_30

  8. Goldberg, D.E.: Genetic Algorithms in Search, Optimization and Machine Learning, 1st edn. Addison-Wesley Longman Publishing Co., Inc., USA (1989)

  9. Holland, J.H.: Escaping brittleness: the possibilities of general-purpose learning algorithms applied to parallel rule-based systems. Machine Learning: An Artificial Intelligence Approach 2, 593–623 (1986)

  10. Kovacs, T.: Towards a theory of strong overgeneral classifiers. In: Foundations of Genetic Algorithms 6, pp. 165–184. Elsevier (2001)

  11. Lanzi, P.L.: An analysis of generalization in the XCS classifier system. Evol. Comput. 7(2), 125–149 (1999)

  12. Nakata, M., Browne, W.N.: Learning optimality theory for accuracy-based learning classifier systems. IEEE Trans. Evol. Comput. 25(1), 61–74 (2020)

  13. Orriols-Puig, A., Bernadó-Mansilla, E.: Bounding XCS’s parameters for unbalanced datasets. In: Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, pp. 1561–1568 (2006)

  14. Stone, C., Bull, L.: For real! XCS with continuous-valued inputs. Evol. Comput. 11(3), 299–336 (2003)

  15. Sutton, R.S.: Learning to predict by the methods of temporal differences. Mach. Learn. 3(1), 9–44 (1988)

  16. Tadokoro, M., Hasegawa, S., Tatsumi, T., Sato, H., Takadama, K.: Local covering: adaptive rule generation method using existing rules for XCS. In: 2020 IEEE Congress on Evolutionary Computation (CEC), pp. 1–8. IEEE (2020)

  17. Tadokoro, M., Sato, H., Takadama, K.: XCS with weight-based matching in VAE latent space and additional learning of high-dimensional data. In: 2021 IEEE Congress on Evolutionary Computation (CEC), pp. 304–310. IEEE (2021)

  18. Venturini, G.: Adaptation in dynamic environments through a minimal probability of exploration. In: Proceedings of the Third International Conference on Simulation of Adaptive Behavior: from Animals to Animats 3, pp. 371–379 (1994)

  19. Wada, A., Takadama, K., Shimohara, K., Katai, O.: Analyzing parameter sensitivity and classifier representations for real-valued XCS. In: Kovacs, T., Llorà, X., Takadama, K., Lanzi, P.L., Stolzmann, W., Wilson, S.W. (eds.) IWLCS 2003-2005. LNCS (LNAI), vol. 4399, pp. 1–16. Springer, Heidelberg (2007). https://doi.org/10.1007/978-3-540-71231-2_1

  20. Wagner, A.R.M., Stein, A.: On the effects of absumption for XCS with continuous-valued inputs. In: Castillo, P.A., Jiménez Laredo, J.L. (eds.) EvoApplications 2021. LNCS, vol. 12694, pp. 697–713. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-72699-7_44

  21. Widrow, B., Hoff, M.E.: Adaptive switching circuits. Technical report, Stanford University, Stanford Electronics Labs (1960)

  22. Wilson, S.W.: Classifier fitness based on accuracy. Evol. Comput. 3(2), 149–175 (1995)

  23. Wilson, S.W.: Get real! XCS with continuous-valued inputs. In: Lanzi, P.L., Stolzmann, W., Wilson, S.W. (eds.) IWLCS 1999. LNCS (LNAI), vol. 1813, pp. 209–219. Springer, Heidelberg (2000). https://doi.org/10.1007/3-540-45027-0_11

Author information

Corresponding author

Correspondence to Hiroki Shiraishi.

Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Shiraishi, H., Hayamizu, Y., Nakari, I., Sato, H., Takadama, K. (2022). Inheritance vs. Expansion: Generalization Degree of Nearest Neighbor Rule in Continuous Space as Covering Operator of XCS. In: Jiménez Laredo, J.L., Hidalgo, J.I., Babaagba, K.O. (eds) Applications of Evolutionary Computation. EvoApplications 2022. Lecture Notes in Computer Science, vol 13224. Springer, Cham. https://doi.org/10.1007/978-3-031-02462-7_23

  • DOI: https://doi.org/10.1007/978-3-031-02462-7_23

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-02461-0

  • Online ISBN: 978-3-031-02462-7

  • eBook Packages: Computer Science (R0)
