Can Demographic Factors Improve Text Classification? Revisiting Demographic Adaptation in the Age of Transformers (2210.07362v2)

Published 13 Oct 2022 in cs.CL

Abstract: Demographic factors (e.g., gender or age) shape our language. Previous work showed that incorporating demographic factors can consistently improve performance for various NLP tasks with traditional NLP models. In this work, we investigate whether these previous findings still hold with state-of-the-art Transformer-based pretrained language models (PLMs). We use three common specialization methods proven effective for incorporating external knowledge (e.g., domain-specific or geographic knowledge) into pretrained Transformers. We adapt the language representations for the demographic dimensions of gender and age, using continuous language modeling and dynamic multi-task learning for adaptation, where we couple language modeling objectives with the prediction of demographic classes. Our results, when employing a multilingual PLM, show substantial gains in task performance across four languages (English, German, French, and Danish), which is consistent with the results of previous work. However, controlling for confounding factors (primarily the domain and language proficiency of Transformer-based PLMs) shows that downstream performance gains from our demographic adaptation do not actually stem from demographic knowledge. Our results indicate that demographic specialization of PLMs, while holding promise for positive societal impact, still represents an unsolved problem for (modern) NLP.
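The adaptation recipe described in the abstract, coupling a language modeling objective with demographic class prediction via multi-task learning, can be illustrated with a short sketch. The snippet below is a minimal illustration only, assuming a HuggingFace-style multilingual PLM (here `bert-base-multilingual-cased`), a single linear classification head over the [CLS] representation, and a fixed loss weight `alpha`; these choices are assumptions for exposition, not the authors' exact configuration.

```python
# Minimal sketch of multi-task demographic adaptation: a masked language
# modeling (MLM) objective is jointly optimized with demographic class
# prediction. Model name, head design, and the fixed loss weighting are
# illustrative assumptions, not the paper's exact setup.
import torch
import torch.nn as nn
from transformers import AutoModelForMaskedLM

class DemographicAdapter(nn.Module):
    def __init__(self, plm_name="bert-base-multilingual-cased", num_classes=2):
        super().__init__()
        self.mlm = AutoModelForMaskedLM.from_pretrained(plm_name)
        hidden = self.mlm.config.hidden_size
        # Linear head over the [CLS] representation predicting the
        # demographic class (e.g., gender or age group).
        self.clf = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask, mlm_labels, demo_labels,
                alpha=0.5):
        out = self.mlm(input_ids=input_ids,
                       attention_mask=attention_mask,
                       labels=mlm_labels,
                       output_hidden_states=True)
        cls_repr = out.hidden_states[-1][:, 0]  # [CLS] token embedding
        clf_loss = nn.functional.cross_entropy(self.clf(cls_repr), demo_labels)
        # Joint objective: weighted sum of the MLM loss and the
        # demographic prediction loss.
        return alpha * out.loss + (1 - alpha) * clf_loss
```

A dynamic multi-task variant would adjust the weighting between the two losses during training rather than fixing `alpha`; the sketch keeps it constant for brevity.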

Authors (5)
  1. Chia-Chien Hung
  2. Anne Lauscher
  3. Dirk Hovy
  4. Simone Paolo Ponzetto
  5. Goran Glavaš
Citations (13)