
Selection Heuristics on Semantic Genetic Programming for Classification Problems (1907.07066v4)

Published 16 Jul 2019 in cs.LG, cs.NE, and stat.ML

Abstract: Individuals' semantics have been used to guide the learning process of Genetic Programming when solving supervised learning problems. Semantics has been used both to propose novel genetic operators and to design different ways of performing parent selection. The latter is the focus of this contribution, which proposes three heuristics for parent selection that entirely replace the fitness function in the selection mechanism. These heuristics complement previous work by being inspired by the characteristics of the addition, Naive Bayes, and Nearest Centroid functions, and by being applied only when the corresponding function is used to create an offspring. The heuristics use different similarity measures among the parents to decide which of them is more appropriate for a given function; the similarity functions considered are cosine similarity, Pearson's correlation, and agreement. We analyze the heuristics' performance against random selection, state-of-the-art selection schemes, and 18 classifiers, including auto-machine-learning techniques, on 30 classification problems with varying numbers of samples, variables, and classes. The results indicate that combining agreement-based parent selection with random selection of the individual to replace in the population produces statistically better results than the classical selection and state-of-the-art schemes, and is competitive with state-of-the-art classifiers. Finally, the code is released as open-source software.
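To make the selection heuristics concrete, the sketch below shows how a second parent could be chosen by comparing candidate semantics with the three similarity measures named in the abstract. The function names (`agreement`, `select_second_parent`), the sign-based definition of agreement, and the choice to prefer the least-similar candidate are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two semantic vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def pearson_similarity(a, b):
    """Pearson's correlation between two semantic vectors."""
    return float(np.corrcoef(a, b)[0, 1])

def agreement(a, b):
    """Assumed definition: fraction of samples where the two semantics share the same sign."""
    return float(np.mean(np.sign(a) == np.sign(b)))

def select_second_parent(first_sem, candidate_sems, similarity=agreement):
    """Hypothetical heuristic: pick the candidate whose semantics are least
    similar to the first parent, so the combining function (e.g., addition)
    receives complementary behaviours."""
    scores = [similarity(first_sem, c) for c in candidate_sems]
    return int(np.argmin(scores))

# Toy usage: semantics are per-sample outputs of candidate programs.
rng = np.random.default_rng(0)
first = rng.normal(size=100)
pool = [rng.normal(size=100) for _ in range(5)]
print(select_second_parent(first, pool))
```

In this sketch the similarity measure is a parameter, mirroring the abstract's idea of matching the measure to the function used to create the offspring; the actual selection criterion used in the paper may differ.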
