AMPSO: Artificial Multi-Swarm Particle Swarm Optimization (2004.07561v2)

Published 16 Apr 2020 in cs.NE

Abstract: In this paper we propose a novel artificial multi-swarm PSO which consists of an exploration swarm, an artificial exploitation swarm and an artificial convergence swarm. The exploration swarm is a set of equal-sized sub-swarms randomly distributed over the search space; the exploitation swarm is artificially generated from a perturbation of the best particle of the exploration swarm for a fixed period of iterations; and the convergence swarm is artificially generated from a Gaussian perturbation of the best particle in the exploitation swarm once it stagnates. The exploration and exploitation operations are carried out alternately until the evolution rate of the exploitation is smaller than a threshold or the maximum number of iterations is reached. An adaptive inertia weight strategy is applied to the different swarms to guarantee their exploration and exploitation performance. To guarantee the accuracy of the results, a novel diversity scheme based on the positions and fitness values of the particles is proposed to control the exploration, exploitation and convergence processes of the swarms. To mitigate the inefficiency introduced by the diversity scheme, two swarm update techniques are proposed to discard poor particles so that good results can be achieved within a fixed number of iterations. The effectiveness of AMPSO is validated on all the functions in the CEC2015 test suite, by comparison with a comprehensive set of 16 algorithms, including recent well-performing PSO variants and some other non-PSO optimization algorithms.
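
The abstract describes a three-phase multi-swarm structure: exploration sub-swarms, an exploitation swarm generated by perturbing the best explorer, and a convergence swarm drawn from a Gaussian around the exploitation best. The following is a minimal Python sketch of that structure only; the helper names (run_swarm, ampso_sketch), the parameter values (n_sub, sub_size, sigma, the inertia-weight schedules), and the toy sphere objective are illustrative assumptions, and the paper's diversity scheme, evolution-rate stopping criterion and swarm-update techniques are not reproduced here.

```python
import numpy as np

def sphere(x):
    """Toy objective for illustration; the paper benchmarks on CEC2015 instead."""
    return float(np.sum(x ** 2))

def run_swarm(f, init_pos, iters, w_start, w_end, bounds, c1=1.5, c2=1.5):
    """Run one standard PSO swarm with a linearly decreasing inertia weight
    and return its best position and value."""
    lo, hi = bounds
    pos = init_pos.copy()
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    for t in range(iters):
        w = w_start + (w_end - w_start) * t / max(iters - 1, 1)
        gbest = pbest[np.argmin(pbest_val)]
        r1 = np.random.rand(*pos.shape)
        r2 = np.random.rand(*pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([f(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
    i = int(np.argmin(pbest_val))
    return pbest[i], pbest_val[i]

def ampso_sketch(f=sphere, dim=10, n_sub=4, sub_size=10, iters=200,
                 sigma=0.05, bounds=(-100.0, 100.0), seed=0):
    """Illustrative three-phase loop; not the paper's full AMPSO algorithm."""
    np.random.seed(seed)
    lo, hi = bounds
    # Phase 1: exploration -- equal-sized sub-swarms scattered over the
    # search space (larger inertia weight favours exploration).
    best_x, best_v = None, np.inf
    for _ in range(n_sub):
        init = np.random.uniform(lo, hi, (sub_size, dim))
        x, v = run_swarm(f, init, iters, 0.9, 0.6, bounds)
        if v < best_v:
            best_x, best_v = x, v
    # Phase 2: exploitation -- a swarm artificially generated by perturbing
    # the best explorer within a small fraction of the search range.
    init = best_x + np.random.uniform(-1, 1, (sub_size, dim)) * sigma * (hi - lo)
    x, v = run_swarm(f, np.clip(init, lo, hi), iters, 0.6, 0.4, bounds)
    if v < best_v:
        best_x, best_v = x, v
    # Phase 3: convergence -- a swarm drawn from a Gaussian around the
    # exploitation best (smaller inertia weight for fine local search).
    init = best_x + np.random.normal(0.0, sigma * (hi - lo), (sub_size, dim))
    x, v = run_swarm(f, np.clip(init, lo, hi), iters, 0.4, 0.2, bounds)
    if v < best_v:
        best_x, best_v = x, v
    return best_x, best_v

if __name__ == "__main__":
    x, v = ampso_sketch()
    print("best value found:", v)
```

In the full method the phases are not run once in sequence as above: exploration and exploitation alternate until the exploitation evolution rate drops below a threshold, and the diversity scheme gates when each phase switches.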

Authors (4)
  1. Haohao Zhou (1 paper)
  2. Zhi-Hui Zhan (6 papers)
  3. Zhi-Xin Yang (16 papers)
  4. Xiangzhi Wei (2 papers)
