
When large language models meet evolutionary algorithms (2401.10510v2)

Published 19 Jan 2024 in cs.NE, cs.AI, cs.CL, and cs.LG

Abstract: Pre-trained LLMs have powerful capabilities for generating creative natural text. Evolutionary algorithms (EAs) can discover diverse solutions to complex real-world problems. Motivated by the shared collective nature and directionality of text generation and evolution, this paper illustrates the parallels between LLMs and EAs through several one-to-one correspondences: token representation and individual representation, position encoding and fitness shaping, position embedding and selection, the Transformer block and reproduction, and model training and parameter adaptation. By examining these parallels, we analyze existing interdisciplinary research, with a specific focus on evolutionary fine-tuning and LLM-enhanced EAs. Drawing from these insights, valuable future directions are presented for advancing the integration of LLMs and EAs, while highlighting key challenges along the way. These parallels not only reveal the evolution mechanism behind LLMs but also facilitate the development of evolved artificial agents that approach or surpass biological organisms.
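To make the EA side of the abstract's correspondences concrete, here is a minimal evolutionary loop with the selection and reproduction operators the paper maps onto LLM components. This is a generic illustrative sketch on the toy OneMax objective, not code from the paper; the function names and parameter values are our own choices.

```python
import random

random.seed(0)

def fitness(bits):
    # OneMax: count the 1-bits; a stand-in for any problem-specific objective
    return sum(bits)

def select(pop, k=2):
    # tournament selection: the "selection" operator the paper pairs with
    # position embedding in the LLM analogy
    return max(random.sample(pop, k), key=fitness)

def reproduce(a, b, p_mut=0.05):
    # one-point crossover plus bit-flip mutation: the "reproduction" operator
    # the paper pairs with the Transformer block
    cut = random.randrange(1, len(a))
    child = a[:cut] + b[cut:]
    return [bit ^ (random.random() < p_mut) for bit in child]

def evolve(n_bits=32, pop_size=20, generations=60):
    # individuals are bit strings, analogous to token sequences
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop = [reproduce(select(pop), select(pop)) for _ in range(pop_size)]
    return max(pop, key=fitness)

best = evolve()
```

In an LLM-enhanced EA, as surveyed in the paper, `reproduce` would be replaced by prompting a language model to recombine and vary parent solutions expressed as text, while the selection and fitness machinery stays the same.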

Authors (6)
  1. Wang Chao (7 papers)
  2. Jiaxuan Zhao (13 papers)
  3. Licheng Jiao (109 papers)
  4. Lingling Li (34 papers)
  5. Fang Liu (800 papers)
  6. Shuyuan Yang (36 papers)
Citations (13)