Task Transfer and Domain Adaptation for Zero-Shot Question Answering (2206.06705v1)

Published 14 Jun 2022 in cs.CL and cs.LG

Abstract: Pretrained LLMs have shown success in various areas of natural language processing, including reading comprehension tasks. However, when applying machine learning methods to new domains, labeled data may not always be available. To address this, we use supervised pretraining on source-domain data to reduce sample complexity on domain-specific downstream tasks. We evaluate zero-shot performance on domain-specific reading comprehension tasks by combining task transfer with domain adaptation to fine-tune a pretrained model with no labeled data from the target task. Our approach outperforms Domain-Adaptive Pretraining on downstream domain-specific reading comprehension tasks in 3 out of 4 domains.
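
The pipeline the abstract describes (supervised pretraining on a source-domain QA task, then zero-shot application to a target domain with no labeled target data) can be illustrated with off-the-shelf tooling. The sketch below is not the authors' implementation: the checkpoint name and the biomedical example are assumptions, and the domain-adaptation step (continued pretraining on unlabeled target-domain text) is omitted for brevity.

```python
# Minimal sketch of zero-shot QA task transfer with Hugging Face transformers.
# Assumes a reader already fine-tuned on a source-domain extractive QA task
# (e.g., SQuAD); the checkpoint and example passage are illustrative only.
from transformers import pipeline

# Task transfer: load a model fine-tuned on source-domain question answering.
qa = pipeline(
    "question-answering",
    model="deepset/roberta-base-squad2",  # hypothetical choice of checkpoint
)

# Zero-shot evaluation on a target-domain (here, biomedical) passage:
# no labeled QA pairs from the target domain are used for training.
context = (
    "Aspirin irreversibly inhibits cyclooxygenase, reducing the formation "
    "of prostaglandins and thromboxanes."
)
result = qa(question="What enzyme does aspirin inhibit?", context=context)
print(result["answer"], result["score"])
```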

Authors (6)
  1. Xiang Pan (52 papers)
  2. Alex Sheng (4 papers)
  3. David Shimshoni (1 paper)
  4. Aditya Singhal (3 papers)
  5. Sara Rosenthal (21 papers)
  6. Avirup Sil (45 papers)
Citations (4)