Compact NSGA-II for Multi-objective Feature Selection (2402.12625v1)

Published 20 Feb 2024 in cs.LG and cs.NE

Abstract: Feature selection is an expensive, challenging task in machine learning and data mining, aimed at removing irrelevant and redundant features. This improves classification accuracy and reduces the budget and memory requirements of classification, or of any other post-processing task conducted after feature selection. In this regard, we define feature selection as a multi-objective binary optimization task with the objectives of maximizing classification accuracy and minimizing the number of selected features. To select optimal features, we propose a binary Compact NSGA-II (CNSGA-II) algorithm. Compactness represents the population as a probability distribution, which makes evolutionary algorithms not only more memory-efficient but also able to reduce the number of fitness evaluations. Instead of holding two populations during the optimization process, our proposed method uses several Probability Vectors (PVs) to generate new individuals. Each PV efficiently explores a region of the search space to find non-dominated solutions, instead of generating candidate solutions from a small population as is common in most evolutionary algorithms. To the best of our knowledge, this is the first compact multi-objective algorithm proposed for feature selection. The reported results for expensive optimization cases with a limited budget on five datasets show that CNSGA-II performs more efficiently than the well-known NSGA-II in terms of the hypervolume (HV) performance metric while requiring less memory. The proposed method and experimental results are explained and analyzed in detail.
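To make the probability-vector idea concrete, here is a minimal single-PV sketch in Python. It illustrates only the core compact mechanism: sample two binary candidates from a probability vector, evaluate both objectives, and move the vector toward the Pareto-dominating candidate. The paper's CNSGA-II maintains several PVs with non-dominated sorting across them; the dataset, classifier, step size, clipping bounds, and archive handling below are illustrative assumptions, not the authors' exact design.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset, not from the paper
n = X.shape[1]

def evaluate(mask):
    """Objectives to minimize: (1 - accuracy, fraction of selected features)."""
    if not mask.any():
        return (1.0, 0.0)  # empty subset: worst possible accuracy
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask], y, cv=3).mean()
    return (1.0 - acc, mask.sum() / n)

def dominates(f, g):
    """Pareto dominance for minimization: f is no worse everywhere, better somewhere."""
    return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

pv = np.full(n, 0.5)   # probability vector: a compact "virtual population"
step = 1.0 / 50        # update step, roughly 1 / virtual population size (assumed)
archive = []           # non-dominated (mask, objectives) pairs found so far

for _ in range(100):   # each iteration costs only two fitness evaluations
    a = rng.random(n) < pv
    b = rng.random(n) < pv
    fa, fb = evaluate(a), evaluate(b)
    if dominates(fb, fa):                 # make `a` the winner when comparable
        a, b, fa, fb = b, a, fb, fa
    if dominates(fa, fb):                 # update the PV only when one candidate wins
        pv += step * (a.astype(float) - b.astype(float))
        np.clip(pv, 1.0 / n, 1.0 - 1.0 / n, out=pv)  # keep some exploration alive
    # archive maintenance: keep only mutually non-dominated solutions
    if not any(dominates(f, fa) for _, f in archive):
        archive = [(m, f) for m, f in archive if not dominates(fa, f)]
        archive.append((a.copy(), fa))
```

The PV replaces an explicit population: memory is O(number of features) per vector, and each iteration costs only two fitness evaluations, which is what makes the compact approach attractive under the limited evaluation budgets the paper targets.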
