
Cooperative coevolutionary hybrid NSGA-II with Linkage Measurement Minimization for Large-scale Multi-objective optimization (2208.13415v1)

Published 29 Aug 2022 in cs.NE

Abstract: In this paper, we propose a variable grouping method based on cooperative coevolution for large-scale multi-objective problems (LSMOPs), named Linkage Measurement Minimization (LMM). For the sub-problem optimization stage, we propose a hybrid NSGA-II with a Gaussian sampling operator based on an estimated convergence point. In the variable grouping stage, following our previous research, we treat variable grouping as a combinatorial optimization problem and design the linkage measurement function based on linkage identification by the nonlinearity check on real code (LINC-R); we extend this variable grouping method to LSMOPs. In the sub-problem optimization stage, we hypothesize that better solutions are more likely to exist around the Pareto Front (PF). Based on this hypothesis, we estimate a convergence point at every generation and perform Gaussian sampling around it; samples with good objective values participate in the optimization as elites. Numerical experiments show that our variable grouping method outperforms several popular variable grouping methods, and that the hybrid NSGA-II shows broad promise for multi-objective optimization.
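The abstract outlines two mechanisms: a LINC-R-style nonlinearity check used inside the linkage measurement, and a Gaussian sampling operator around an estimated convergence point. Below is a minimal sketch of both ideas, not the authors' implementation; the function names (linc_r_nonlinearity, gaussian_sampling_elites), the convergence-point estimate, the elite filter, and parameters such as delta, eps, and sigma are illustrative assumptions.

```python
import numpy as np

def linc_r_nonlinearity(f, x, i, j, delta=1.0, eps=1e-6):
    """Sketch of a LINC-R-style check on a scalar objective f:
    variables i and j are flagged as linked if perturbing x_i has a
    different effect depending on whether x_j was also perturbed."""
    e_i = np.zeros_like(x); e_i[i] = delta
    e_j = np.zeros_like(x); e_j[j] = delta
    d_i  = f(x + e_i) - f(x)                # effect of perturbing x_i alone
    d_ij = f(x + e_i + e_j) - f(x + e_j)    # effect of perturbing x_i after x_j
    return abs(d_ij - d_i) > eps            # True -> nonlinear interaction (linkage)

def gaussian_sampling_elites(pop, objs, evaluate, n_samples=10, sigma=0.1):
    """Sketch of Gaussian sampling around an estimated convergence point.
    pop: (N, D) decision vectors; objs: (N, M) objective values (minimization);
    evaluate: maps a decision vector to an M-dimensional objective vector."""
    # Crude convergence-point estimate (assumption): average of the
    # objective-wise best individuals in the current population.
    best_idx = np.argmin(objs, axis=0)
    convergence_point = pop[best_idx].mean(axis=0)
    # Gaussian sampling around the estimated convergence point.
    samples = convergence_point + sigma * np.random.randn(n_samples, pop.shape[1])
    sample_objs = np.array([evaluate(s) for s in samples])
    # Simplified elite filter: keep samples that improve on the population's
    # best value in at least one objective.
    keep = (sample_objs < objs.min(axis=0)).any(axis=1)
    return samples[keep]
```

In a full cooperative coevolutionary loop, the nonlinearity check would feed a linkage measurement to be minimized when grouping variables, and the sampled elites would be merged into the NSGA-II population before environmental selection.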

Authors (2)
  1. Rui Zhong (20 papers)
  2. Masaharu Munetomo (9 papers)
