
A fast converging particle swarm optimization through targeted, position-mutated, elitism (PSO-TPME) (2207.00900v2)

Published 2 Jul 2022 in cs.NE and math.OC

Abstract: We dramatically improve the convergence speed and global exploration capabilities of particle swarm optimization (PSO) through targeted, position-mutated elitism (PSO-TPME). The three key innovations address particle classification, elitism, and mutation in the cognitive and social model. PSO-TPME is benchmarked against five popular PSO variants on multi-dimensional benchmark functions that are extensively adopted in the optimization field. In particular, the convergence accuracy, convergence speed, and the capability to find global minima are investigated. The statistical error is assessed by numerous repetitions. The simulations demonstrate that the proposed PSO variant outperforms the other variants in terms of convergence rate and accuracy by orders of magnitude.
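The abstract describes three ingredients layered on top of a standard cognitive/social PSO update: particle classification, elitism, and targeted position mutation. The sketch below is a minimal illustration of that general idea, not the paper's exact scheme; the classification rule (worse than the swarm mean), the mutation rule (re-seeding poor particles around the current elite), the 0.1 mutation scale, and the function name pso_tpme_sketch are all assumptions made for illustration.

```python
# Hedged sketch of a PSO loop with illustrative classification, elitism, and
# targeted position mutation. Thresholds and mutation rule are assumptions,
# not the scheme from the paper.
import numpy as np

def sphere(x):
    """Benchmark objective (global minimum 0 at the origin)."""
    return np.sum(x**2, axis=-1)

def pso_tpme_sketch(f, dim=10, n_particles=30, iters=200,
                    w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest, pbest_f = x.copy(), f(x)               # personal bests
    g = pbest[np.argmin(pbest_f)].copy()          # global best

    for _ in range(iters):
        fx = f(x)

        # Classification (assumed rule): particles worse than the swarm mean
        # are flagged as "bad" and targeted for mutation.
        bad = fx >= fx.mean()

        # Elitism (assumed rule): remember the best particle's position so it
        # survives the update unchanged.
        best_idx = np.argmin(fx)
        elite = x[best_idx].copy()

        # Standard cognitive/social velocity and position update.
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)

        # Targeted position mutation (assumed rule): re-seed the flagged
        # particles around the elite position to escape stagnation.
        noise = 0.1 * (hi - lo) * rng.standard_normal(x[bad].shape)
        x[bad] = np.clip(elite + noise, lo, hi)

        # Restore the elite particle, then update personal and global bests.
        x[best_idx] = elite
        fx = f(x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        g = pbest[np.argmin(pbest_f)].copy()

    return g, pbest_f.min()

best_x, best_f = pso_tpme_sketch(sphere)
print(f"best objective found: {best_f:.3e}")
```

Run on the sphere benchmark, this toy loop converges to values near zero; comparing it against plain PSO (drop the classification and mutation steps) gives a rough feel for why re-seeding stagnant particles accelerates convergence.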

Citations (9)
