
OptABC: an Optimal Hyperparameter Tuning Approach for Machine Learning Algorithms (2112.08511v1)

Published 15 Dec 2021 in cs.NE and cs.LG

Abstract: Hyperparameter tuning in machine learning algorithms is a computationally challenging task due to the large-scale nature of the problem. One promising strategy for efficient hyperparameter tuning is to use swarm intelligence algorithms, and Artificial Bee Colony (ABC) optimization lends itself as an efficient optimization algorithm for this purpose. However, ABC can suffer from a slow convergence rate or long execution time due to a poor initial population of solutions and expensive objective functions. To address these concerns, a novel algorithm, OptABC, is proposed to help the ABC algorithm converge faster toward a near-optimal solution. OptABC integrates the artificial bee colony algorithm, K-Means clustering, a greedy algorithm, and an opposition-based learning strategy for tuning the hyperparameters of different machine learning models. OptABC employs these techniques to diversify the initial population and hence enhance convergence without significantly decreasing accuracy. To validate the performance of the proposed method, we compare the results with previous state-of-the-art approaches. Experimental results demonstrate the effectiveness of OptABC compared to existing approaches in the literature.
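
The abstract names four components: ABC search, K-Means clustering, a greedy selection step, and opposition-based learning applied to the initial population. A minimal sketch of how such an opposition-based, cluster-diversified initialization could look is given below; the function name, the specific K-Means/greedy combination, and the toy objective are illustrative assumptions, not the paper's exact procedure.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def opposition_init(lb, ub, pop_size, objective, n_clusters=5):
    # Illustrative (not the paper's exact) initialization:
    # 1) draw a random population in [lb, ub],
    # 2) form its opposite population lb + ub - x (opposition-based learning),
    # 3) cluster the union with K-Means to spread candidates out,
    # 4) greedily keep the best-scoring member of each cluster.
    lb, ub = np.asarray(lb, dtype=float), np.asarray(ub, dtype=float)
    pop = lb + rng.random((pop_size, lb.size)) * (ub - lb)
    union = np.vstack([pop, lb + ub - pop])          # originals + opposites
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(union)
    scores = np.array([objective(x) for x in union]) # one evaluation each
    best_per_cluster = [union[labels == k][scores[labels == k].argmin()]
                        for k in range(n_clusters)]
    return np.array(best_per_cluster)

# Toy objective standing in for an expensive ML validation loss.
sphere = lambda x: float(np.sum(x ** 2))
print(opposition_init(lb=[-5.0, -5.0], ub=[5.0, 5.0],
                      pop_size=20, objective=sphere))

Opposition-based learning doubles the candidate pool at the cost of extra objective evaluations, so a greedy, cluster-wise selection of this kind would plausibly keep the population that enters the ABC loop both small and diverse, consistent with the convergence goal the abstract describes.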

Authors (3)
  1. Leila Zahedi (3 papers)
  2. Farid Ghareh Mohammadi (18 papers)
  3. M. Hadi Amini (42 papers)
Citations (4)
