Cross-lingual Contextualized Topic Models with Zero-shot Learning (2004.07737v2)

Published 16 Apr 2020 in cs.CL

Abstract: Many data sets (e.g., reviews, forums, news, etc.) exist in parallel in multiple languages. They all cover the same content, but the linguistic differences make it impossible to use traditional, bag-of-words-based topic models. Models have to be either single-language or suffer from a huge but extremely sparse vocabulary. Both issues can be addressed by transfer learning. In this paper, we introduce a zero-shot cross-lingual topic model. Our model learns topics in one language (here, English) and predicts them for unseen documents in different languages (here, Italian, French, German, and Portuguese). We evaluate the quality of the topic predictions for the same document in different languages. Our results show that the transferred topics are coherent and stable across languages, which suggests exciting future research directions.
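The core idea can be illustrated with a toy sketch. This is NOT the paper's actual model (which is a neural variational topic model trained on English bags-of-words paired with contextualized multilingual embeddings); it is a simplified stand-in, assuming only that a multilingual encoder maps translations of the same document to nearby vectors in a shared space, so topics fit on English documents transfer zero-shot to other languages:

```python
import numpy as np

# Toy illustration of zero-shot cross-lingual topic transfer (a simplified
# stand-in, not the paper's neural variational model): documents from
# different languages land in a shared embedding space; topic structure is
# fit on English embeddings only, then applied to unseen non-English
# documents without retraining.

rng = np.random.default_rng(0)
n_topics, dim = 3, 8

# Hypothetical shared multilingual space: each latent topic is a region;
# translations of a document land near the same region.
centers = rng.normal(size=(n_topics, dim))

def embed(topic_ids, noise=0.1):
    """Simulate a language-agnostic encoder: same topic -> nearby vectors."""
    return centers[topic_ids] + noise * rng.normal(size=(len(topic_ids), dim))

train_topics = rng.integers(0, n_topics, size=60)
english_embs = embed(train_topics)  # training language only

def fit_centroids(X, k, iters=10):
    """Tiny k-means with farthest-point seeding, standing in for topic fitting."""
    C = [X[0]]
    for _ in range(k - 1):
        d = np.min(((X[:, None] - np.array(C)[None]) ** 2).sum(-1), axis=1)
        C.append(X[np.argmax(d)])
    C = np.array(C)
    for _ in range(iters):
        assign = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (assign == j).any():
                C[j] = X[assign == j].mean(0)
    return C

C = fit_centroids(english_embs, n_topics)

# Zero-shot prediction: the same documents "in Italian" get fresh embeddings
# (different encoder noise), and topics are assigned without retraining.
italian_embs = embed(train_topics)
pred = np.argmin(((italian_embs[:, None] - C[None]) ** 2).sum(-1), axis=1)

# Predicted labels are an arbitrary permutation of the true topics, so we
# check pairwise agreement: documents sharing a true topic should share a
# predicted topic.
same_true = train_topics[:, None] == train_topics[None, :]
same_pred = pred[:, None] == pred[None, :]
agreement = (same_true == same_pred).mean()
print(f"pairwise topic agreement: {agreement:.2f}")
```

With a well-separated shared space, agreement is near 1.0; the paper's evaluation makes the analogous comparison between topic predictions for the same document across languages.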

Authors (5)
  1. Federico Bianchi (47 papers)
  2. Silvia Terragni (8 papers)
  3. Dirk Hovy (57 papers)
  4. Debora Nozza (17 papers)
  5. Elisabetta Fersini (8 papers)
Citations (132)
