
RG-NBEO: a ReliefF guided novel binary equilibrium optimizer with opposition-based S-shaped and V-shaped transfer functions for feature selection

Published in Artificial Intelligence Review

Abstract

In most data mining tasks, feature selection (FS) is a necessary preprocessing step that reduces the dimensionality of the dataset while maintaining adequate classification accuracy. In this paper, a ReliefF-guided novel binary equilibrium optimizer (RG-NBEO) is proposed for feature selection. Building on the binary equilibrium optimizer, two novel mechanisms are employed to improve its search performance. First, two novel transfer functions (SSr and VVr) based on the concept of opposition-based learning are proposed to map the continuous search space into a binary one and to achieve a good balance between exploration and exploitation. Second, a ReliefF bootstrap strategy is proposed to add and remove features directionally during the iterative process according to the feature weights. The simulation experiments are first conducted on equilibrium optimizer (EO) variants constructed from the classical S-shaped and V-shaped transfer functions. The best-performing EO variant is then selected and compared with five state-of-the-art swarm intelligence optimization algorithms and six classical filter feature selection algorithms. The performance of the proposed method is tested on 18 standard datasets, and the results of the different algorithms are statistically evaluated using the Wilcoxon rank sum test and the Friedman rank sum test. The results show that the proposed method effectively improves classification accuracy in most cases.
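The paper's exact SSr and VVr opposition-based transfer functions are not reproduced in this excerpt. As a point of reference, the minimal Python/NumPy sketch below shows the classical S-shaped (sigmoid) and V-shaped (|tanh|) transfer functions commonly used to binarize continuous metaheuristics, together with a hypothetical ReliefF-guided adjustment that switches on the highest-weighted unselected features and switches off the lowest-weighted selected ones. The function names, the parameter k, and the random weights are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def s_shaped(v):
    # Classic S-shaped (sigmoid) transfer function: maps a continuous
    # position value to the probability that the corresponding bit is 1.
    return 1.0 / (1.0 + np.exp(-v))

def v_shaped(v):
    # Classic V-shaped transfer function: probability of flipping the
    # current bit, symmetric around zero.
    return np.abs(np.tanh(v))

def relieff_guided_adjust(x_bin, weights, k=2):
    # Hypothetical ReliefF-guided repair step (illustration only):
    # switch on the k highest-weighted features that are currently off
    # and switch off the k lowest-weighted features that are currently on.
    x = x_bin.copy()
    off = np.flatnonzero(x == 0)
    on = np.flatnonzero(x == 1)
    if off.size:
        x[off[np.argsort(weights[off])[-k:]]] = 1   # add strong features
    if on.size:
        x[on[np.argsort(weights[on])[:k]]] = 0      # drop weak features
    return x

# Toy usage on a random continuous candidate solution.
rng = np.random.default_rng(0)
v = rng.normal(size=10)                              # continuous EO position vector
x_bin = (rng.random(10) < s_shaped(v)).astype(int)   # S-shaped binarization
relieff_w = rng.random(10)                           # stand-in for ReliefF feature weights
print(x_bin, relieff_guided_adjust(x_bin, relieff_w))
```

In the paper, steps of this kind are embedded in the EO update loop, with the ReliefF feature weights steering which bits are added or removed at each iteration; the sketch only illustrates the general mechanism.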




Acknowledgements

This work was supported by the Basic Scientific Research Project of Institutions of Higher Learning of Liaoning Province (Grant No. LJKZ0293) and the Liaoning Provincial Natural Science Foundation of China (Grant No. 20180550700).

Author information


Contributions

MZ participated in the data collection, analysis, algorithm simulation, and draft writing. J-SW participated in the concept, design, and interpretation, and commented on the manuscript. J-NH, H-MS, X-DL and F-JG participated in the critical revision of this paper.

Corresponding author

Correspondence to Jie-Sheng Wang.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest regarding the publication of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhang, M., Wang, JS., Hou, JN. et al. RG-NBEO: a ReliefF guided novel binary equilibrium optimizer with opposition-based S-shaped and V-shaped transfer functions for feature selection. Artif Intell Rev 56, 6509–6556 (2023). https://doi.org/10.1007/s10462-022-10333-y

