Generative Pre-Trained Transformer for Design Concept Generation: An Exploration (2111.08489v1)
Published 16 Nov 2021 in cs.CL and cs.LG
Abstract: Novel concepts are essential for design innovation and can be generated with the aid of data stimuli and computers. However, current generative design algorithms focus on diagrammatic or spatial concepts that are either too abstract to understand or too detailed for early-phase design exploration. This paper explores the use of generative pre-trained transformers (GPT) for natural-language design concept generation. Our experiments apply GPT-2 and GPT-3 to different creative reasoning tasks in design. Both show reasonably good performance for verbal design concept generation.
- Qihao Zhu
- Jianxi Luo