Cross-lingual Approaches for Task-specific Dialogue Act Recognition (2005.09260v2)

Published 19 May 2020 in cs.CL and cs.LG

Abstract: In this paper we exploit cross-lingual models to enable dialogue act recognition for specific tasks with a small number of annotations. We design a transfer learning approach for dialogue act recognition and validate it on two different target languages and domains. We compute dialogue turn embeddings with both a CNN and multi-head self-attention model and show that the best results are obtained by combining all sources of transferred information. We further demonstrate that the proposed methods significantly outperform related cross-lingual DA recognition approaches.
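The abstract mentions two encoders for dialogue turn embeddings: a CNN over word embeddings and a multi-head self-attention model. The sketch below is an illustrative reconstruction of those two components, not the authors' implementation; all hyperparameters (vocabulary size, embedding dimension, kernel widths, number of heads) and the concatenation-based combination are assumptions for demonstration only.

```python
# Minimal sketch (assumed, not from the paper) of the two turn-embedding
# encoders named in the abstract: a CNN encoder and a multi-head
# self-attention encoder, plus a simple combination of both.
import torch
import torch.nn as nn


class CNNTurnEncoder(nn.Module):
    """Encode a dialogue turn by convolving over word embeddings and max-pooling."""

    def __init__(self, vocab_size=10_000, emb_dim=128, num_filters=100, kernel_sizes=(3, 4, 5)):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.convs = nn.ModuleList(
            nn.Conv1d(emb_dim, num_filters, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, token_ids):                     # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)     # (batch, emb_dim, seq_len)
        # One feature vector per kernel width, max-pooled over the time axis.
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return torch.cat(pooled, dim=1)               # (batch, num_filters * len(kernel_sizes))


class SelfAttentionTurnEncoder(nn.Module):
    """Encode a dialogue turn with multi-head self-attention and mean-pooling."""

    def __init__(self, vocab_size=10_000, emb_dim=128, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.attn = nn.MultiheadAttention(emb_dim, num_heads, batch_first=True)

    def forward(self, token_ids):                     # token_ids: (batch, seq_len)
        x = self.embed(token_ids)                     # (batch, seq_len, emb_dim)
        attended, _ = self.attn(x, x, x)              # self-attention within the turn
        return attended.mean(dim=1)                   # (batch, emb_dim)


if __name__ == "__main__":
    turns = torch.randint(1, 10_000, (2, 20))         # two dummy turns of 20 tokens each
    cnn_vec = CNNTurnEncoder()(turns)
    att_vec = SelfAttentionTurnEncoder()(turns)
    # A hypothetical way to combine both sources before a dialogue act classifier.
    combined = torch.cat([cnn_vec, att_vec], dim=1)
    print(cnn_vec.shape, att_vec.shape, combined.shape)
```

In the paper, the turn embeddings feed a dialogue act classifier trained with transfer learning across languages and domains; concatenating the two encoders' outputs here simply illustrates one way of "combining all sources of transferred information," which is where the abstract reports the best results.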

Authors (4)
  1. Jiří Martínek (4 papers)
  2. Christophe Cerisara (13 papers)
  3. Pavel Král (12 papers)
  4. Ladislav Lenc (9 papers)
