Småprat: DialoGPT for Natural Language Generation of Swedish Dialogue by Transfer Learning (2110.06273v2)

Published 12 Oct 2021 in cs.CL and cs.LG

Abstract: Building open-domain conversational systems (or chatbots) that produce convincing responses is a recognized challenge. Recent state-of-the-art (SoTA) transformer-based models for the generation of natural language dialogue have demonstrated impressive performance in simulating human-like, single-turn conversations in English. This work investigates, through an empirical study, the potential for transferring such models to the Swedish language. DialoGPT, a pre-trained English-language model, is adapted by training on three different Swedish conversational datasets obtained from publicly available sources. Perplexity (an automated intrinsic language model metric) and human-evaluation surveys were used to assess the performance of the fine-tuned models, with results indicating that transfer learning can be exploited with considerable success. Human evaluators asked to score the simulated dialogue judged over 57% of the chatbot responses to be human-like for the model trained on the largest (Swedish) dataset. We provide demos and model checkpoints of our English and Swedish chatbots on the HuggingFace platform for public use.
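The intrinsic metric the abstract mentions, perplexity, is simply the exponential of the average per-token negative log-likelihood assigned by the language model. A minimal sketch of that computation (not the authors' evaluation code; the example NLL values are illustrative):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean per-token negative log-likelihood).

    token_nlls: negative log-likelihoods (in nats) that the language
    model assigned to each token of the held-out text.
    """
    return math.exp(sum(token_nlls) / len(token_nlls))

# A model that assigns an average NLL of 2.0 nats per token
# has perplexity exp(2.0) ~= 7.39; lower is better.
print(perplexity([1.5, 2.0, 2.5]))
```

Lower perplexity on held-out Swedish dialogue indicates that the fine-tuned model fits the target language distribution better, which is why it serves as the automated counterpart to the human-evaluation surveys.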

Authors (7)
  1. Tosin Adewumi (27 papers)
  2. Rickard Brännvall (7 papers)
  3. Nosheen Abid (6 papers)
  4. Maryam Pahlavan (1 paper)
  5. Sana Sabah Sabry (5 papers)
  6. Foteini Liwicki (16 papers)
  7. Marcus Liwicki (86 papers)
Citations (19)