
Generating and Blending Game Levels via Quality-Diversity in the Latent Space of a Variational Autoencoder (2102.12463v2)

Published 24 Feb 2021 in cs.LG and cs.AI

Abstract: Several works have demonstrated the use of variational autoencoders (VAEs) for generating levels in the style of existing games and blending levels across different games. Further, quality-diversity (QD) algorithms have also become popular for generating varied game content by using evolution to explore a search space while focusing on both variety and quality. To reap the benefits of both these approaches, we present a level generation and game blending approach that combines the use of VAEs and QD algorithms. Specifically, we train VAEs on game levels and run the MAP-Elites QD algorithm using the learned latent space of the VAE as the search space. The latent space captures the properties of the games whose levels we want to generate and blend, while MAP-Elites searches this latent space to find a diverse set of levels optimizing a given objective such as playability. We test our method using models for 5 different platformer games as well as a blended domain spanning 3 of these games. We refer to using MAP-Elites for blending as Blend-Elites. Our results show that MAP-Elites in conjunction with VAEs enables the generation of a diverse set of playable levels not just for each individual game but also for the blended domain while illuminating game-specific regions of the blended latent space.
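The search loop the abstract describes — mutating latent vectors, decoding them into levels, and keeping the best level per behavior cell — can be sketched minimally as follows. The decoder, the two behavior characteristics, and the objective below are hypothetical stand-ins for the paper's trained VAE decoder and playability metric:

```python
import random

random.seed(0)

LATENT_DIM = 8  # assumed VAE latent dimensionality
GRID = 10       # bins per behavior dimension
ITERS = 2000

def decode(z):
    # Stand-in for the trained VAE decoder; a real decoder would map
    # the latent vector to a tile-based level. Here the "level" is z itself.
    return z

def behavior(level):
    # Two hypothetical behavior characteristics (e.g., tile density and
    # variation), each bucketed into GRID bins.
    b1 = min(GRID - 1, max(0, int((sum(level) / len(level) + 1) / 2 * GRID)))
    b2 = min(GRID - 1, int((max(level) - min(level)) / 2 * GRID))
    return (b1, b2)

def fitness(level):
    # Stand-in objective (the paper optimizes playability);
    # here, negative squared norm so the optimum is well defined.
    return -sum(v * v for v in level)

archive = {}  # MAP-Elites archive: behavior cell -> (fitness, latent vector)

for _ in range(ITERS):
    if archive and random.random() < 0.9:
        # Mutate an existing elite's latent vector.
        parent = random.choice(list(archive.values()))[1]
        z = [v + random.gauss(0, 0.1) for v in parent]
    else:
        # Otherwise sample a fresh latent vector from the prior.
        z = [random.gauss(0, 1) for _ in range(LATENT_DIM)]
    level = decode(z)
    cell = behavior(level)
    f = fitness(level)
    # Keep the candidate only if its cell is empty or it beats the incumbent.
    if cell not in archive or f > archive[cell][0]:
        archive[cell] = (f, z)

print(len(archive))  # number of behavior cells illuminated
```

Because elites are retained per cell rather than globally, the archive accumulates a diverse set of solutions rather than converging to a single optimum — which is what lets the method illuminate game-specific regions of the blended latent space.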

Citations (29)
