Multi-Layer Competitive-Cooperative Framework for Performance Enhancement of Differential Evolution (1801.10546v3)

Published 31 Jan 2018 in cs.NE

Abstract: Differential Evolution (DE) is recognized as one of the most powerful optimizers in the evolutionary algorithm (EA) family. Many DE variants have been proposed in recent years, but significant performance differences between them are rarely observed. This paper therefore proposes a multi-layer competitive-cooperative (MLCC) framework to facilitate the competition and cooperation of multiple DEs, which in turn yields a significant performance improvement. Unlike other multi-method strategies that adopt a multi-population structure, in which individuals evolve only within their corresponding subpopulations, MLCC implements a parallel structure in which the entire population is simultaneously monitored by multiple DEs assigned to corresponding layers. An individual can store, utilize, and update its evolution information in different layers through an individual-preference-based layer selecting (IPLS) mechanism and a computational resource allocation bias (RAB) mechanism. In IPLS, each individual connects to only one favorite layer, whereas in RAB, high-quality solutions are evolved by considering all the layers. The DEs associated with the layers thus work in a competitive and cooperative manner. The proposed MLCC framework has been implemented on several highly competitive DEs. Experimental studies show that the MLCC variants significantly outperform the baseline DEs as well as several state-of-the-art and up-to-date DEs on the CEC benchmark functions.
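
For a concrete picture of the parallel-layer idea, the following is a minimal Python sketch, not the authors' code: the "layers" are simply two DE parameter configurations operating on one shared population, and each individual evolves through its currently preferred layer, switching preference when that layer fails to improve it (loosely mimicking IPLS). The RAB mechanism is omitted, and all names (mlcc_sketch, de_step) and parameter values are illustrative assumptions.

```python
import random

def sphere(x):
    """Simple test objective (minimize)."""
    return sum(v * v for v in x)

def de_step(pop, i, F, CR, func):
    """One DE/rand/1/bin trial for individual i; returns (trial vector, fitness)."""
    a, b, c = random.sample([j for j in range(len(pop)) if j != i], 3)
    dim = len(pop[i])
    jrand = random.randrange(dim)
    trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
             if (random.random() < CR or d == jrand) else pop[i][d]
             for d in range(dim)]
    return trial, func(trial)

def mlcc_sketch(func=sphere, dim=10, pop_size=30, generations=200):
    # Two DE parameter configurations stand in for the "layers"; the whole
    # population is shared by both, rather than split into subpopulations.
    layers = [{"F": 0.5, "CR": 0.9}, {"F": 0.8, "CR": 0.1}]
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    fit = [func(x) for x in pop]
    pref = [random.randrange(len(layers)) for _ in range(pop_size)]  # IPLS-like preference

    for _ in range(generations):
        for i in range(pop_size):
            k = pref[i]
            trial, f_trial = de_step(pop, i, layers[k]["F"], layers[k]["CR"], func)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial           # success: keep the preferred layer
            else:
                pref[i] = random.randrange(len(layers))   # failure: reconsider the layer choice
    return min(fit)

if __name__ == "__main__":
    print("best fitness:", mlcc_sketch())
```

The intent of the sketch is only to contrast the shared-population, multi-layer structure with the usual multi-population multi-method design; the paper's actual IPLS and RAB rules are more involved than the simple preference switch shown here.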

Citations (30)

Summary

We haven't generated a summary for this paper yet.