Generative Pre-Trained Transformer for Design Concept Generation: An Exploration (2111.08489v1)

Published 16 Nov 2021 in cs.CL and cs.LG

Abstract: Novel concepts are essential for design innovation and can be generated with the aid of data stimuli and computers. However, current generative design algorithms focus on diagrammatic or spatial concepts that are either too abstract to understand or too detailed for early phase design exploration. This paper explores the uses of generative pre-trained transformers (GPT) for natural language design concept generation. Our experiments involve the use of GPT-2 and GPT-3 for different creative reasonings in design tasks. Both show reasonably good performance for verbal design concept generation.
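To make the idea concrete, below is a minimal sketch (not the authors' code) of how a pre-trained GPT-2 model can be prompted to produce verbal design concepts, using the Hugging Face transformers library. The prompt text, sampling settings, and model choice are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: generating candidate verbal design concepts with GPT-2.
# Assumes the `transformers` library is installed; the prompt is hypothetical.
from transformers import pipeline

# Load a general-purpose GPT-2 text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

# Example design stimulus (illustrative only).
prompt = "Design concept: a portable device that purifies water using"

outputs = generator(
    prompt,
    max_new_tokens=40,        # keep each generated concept short
    num_return_sequences=3,   # sample several candidate concepts
    do_sample=True,
    top_p=0.9,
)

for i, out in enumerate(outputs, 1):
    print(f"Concept {i}: {out['generated_text']}")
```

In this style of use, the language model acts as a data-driven stimulus: it proposes several natural-language concept candidates that a designer can then filter and refine during early-phase exploration.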

Authors (2)
  1. Qihao Zhu (27 papers)
  2. Jianxi Luo (40 papers)
Citations (48)