Parameter-efficient Zero-shot Transfer for Cross-Language Dense Retrieval with Adapters (2212.10448v1)

Published 20 Dec 2022 in cs.IR and cs.CL

Abstract: A popular approach to creating a zero-shot cross-language retrieval model is to substitute the monolingual pretrained language model in the retrieval model with a multilingual pretrained language model such as Multilingual BERT. This multilingual model is fine-tuned to the retrieval task with monolingual data such as English MS MARCO, using the same training recipe as the monolingual retrieval model. However, such transferred models suffer from mismatches in the languages of the input text during training and inference. In this work, we propose transferring monolingual retrieval models using adapters, a parameter-efficient component for a transformer network. By adding adapters pretrained on language tasks for a specific language together with task-specific adapters, prior work has shown that adapter-enhanced models perform better than fine-tuning the entire model when transferring across languages in various NLP tasks. By constructing dense retrieval models with adapters, we show that models trained with monolingual data are more effective than fine-tuning the entire model when transferring to a Cross-Language Information Retrieval (CLIR) setting. However, we find that the prior suggestion of replacing the language adapters to match the target language at inference time is suboptimal for dense retrieval models. We provide an in-depth analysis of this discrepancy between other cross-language NLP tasks and CLIR.
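The adapters the abstract refers to are typically small bottleneck modules inserted into each transformer layer: only the adapter weights are trained while the pretrained model stays frozen. A minimal sketch of such a bottleneck adapter in NumPy (the layer sizes and the zero-initialized up-projection are common conventions, not details taken from this paper):

```python
import numpy as np

def bottleneck_adapter(x, W_down, W_up):
    """Bottleneck adapter: down-project, ReLU, up-project, residual add.

    Only W_down and W_up would be trained; the surrounding
    transformer weights stay frozen (parameter-efficient transfer).
    """
    h = np.maximum(x @ W_down, 0.0)  # down-projection + ReLU
    return x + h @ W_up              # residual keeps the pretrained representation

rng = np.random.default_rng(0)
d_model, d_bottleneck = 768, 64      # illustrative sizes (assumed)
W_down = rng.normal(0.0, 0.02, size=(d_model, d_bottleneck))
W_up = np.zeros((d_bottleneck, d_model))  # zero init: adapter starts as identity

x = rng.normal(size=(4, d_model))    # a batch of 4 token representations
out = bottleneck_adapter(x, W_down, W_up)
# With W_up at zero, the adapter is an exact identity function,
# so inserting it does not perturb the pretrained model before training.
```

Because the up-projection starts at zero, the residual path makes the module an identity map at initialization, which is why adapters can be added to a pretrained network without degrading it before fine-tuning begins.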

Authors (5)
  1. Eugene Yang (37 papers)
  2. Suraj Nair (39 papers)
  3. Dawn Lawrie (30 papers)
  4. James Mayfield (21 papers)
  5. Douglas W. Oard (18 papers)
Citations (4)