Automatic Generation of German Drama Texts Using Fine Tuned GPT-2 Models (2301.03119v2)

Published 8 Jan 2023 in cs.CL

Abstract: This study is devoted to the automatic generation of German drama texts. We suggest an approach consisting of two key steps: fine-tuning a GPT-2 model (the outline model) to generate scene outlines from keywords, and fine-tuning a second model (the generation model) to generate scenes from those outlines. The input for the neural models comprises two datasets: the German Drama Corpus (GerDraCor) and the German Text Archive (Deutsches Textarchiv, DTA). To estimate the effectiveness of the proposed method, our models are compared with baseline GPT-2 models. Our models perform well according to automatic quantitative evaluation, but manual qualitative analysis reveals poor quality in the generated texts. This may be due to the quality of the dataset or of the training inputs.
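
The two-stage pipeline described in the abstract (keywords → scene outline → scene text) can be sketched with the Hugging Face transformers library. The checkpoint directories ("outline-model", "generation-model"), the German prompt templates, and the sampling settings below are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch of a two-stage GPT-2 generation pipeline, assuming two
# separately fine-tuned checkpoints. Names and prompt formats are hypothetical.
from transformers import GPT2LMHeadModel, GPT2Tokenizer


def generate(model_dir: str, prompt: str, max_new_tokens: int = 200) -> str:
    """Load a fine-tuned GPT-2 checkpoint and sample a continuation."""
    tokenizer = GPT2Tokenizer.from_pretrained(model_dir)
    model = GPT2LMHeadModel.from_pretrained(model_dir)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_p=0.9,
        temperature=0.8,
        pad_token_id=tokenizer.eos_token_id,
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


# Stage 1: keywords -> scene outline (outline model).
keywords = "Liebe, Verrat, König"  # example keywords, hypothetical
outline = generate("outline-model", f"Stichwörter: {keywords}\nSzenenumriss:")

# Stage 2: scene outline -> full drama scene (generation model).
scene = generate("generation-model", f"Szenenumriss: {outline}\nSzene:")
print(scene)
```

In this sketch the outline model's output is fed verbatim as the prompt of the generation model; how the paper formats the intermediate outline is not specified in the abstract.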

Citations (2)
