ERNIE-Search: Bridging Cross-Encoder with Dual-Encoder via Self On-the-fly Distillation for Dense Passage Retrieval (2205.09153v1)
Abstract: Neural retrievers based on pre-trained language models (PLMs), such as dual-encoders, have achieved promising performance on the task of open-domain question answering (QA). Their effectiveness can be pushed to new state-of-the-art levels by incorporating cross-architecture knowledge distillation. However, most existing studies directly apply conventional distillation methods and fail to account for the situation where the teacher and student have different architectures. In this paper, we propose a novel distillation method that significantly advances cross-architecture distillation for dual-encoders. Our method 1) introduces a self on-the-fly distillation method that can effectively distill a late-interaction model (i.e., ColBERT) into a vanilla dual-encoder, and 2) incorporates a cascade distillation process to further improve performance with a cross-encoder teacher. Extensive experiments validate that our proposed solution outperforms strong baselines and establishes a new state-of-the-art on open-domain QA benchmarks.
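To make the architectural gap concrete, here is a minimal NumPy sketch of the two scoring functions the paper bridges (vanilla dual-encoder dot product vs. ColBERT-style late interaction) and the kind of listwise KL distillation objective commonly used in this line of work. Function names and shapes are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def dual_encoder_score(q_vec, p_vec):
    # Vanilla dual-encoder: one pooled vector per side, dot-product relevance.
    return float(np.dot(q_vec, p_vec))

def late_interaction_score(q_toks, p_toks):
    # ColBERT-style late interaction (MaxSim): for each query token, take the
    # max similarity over passage tokens, then sum over query tokens.
    sim = q_toks @ p_toks.T  # (num_query_tokens, num_passage_tokens)
    return float(sim.max(axis=1).sum())

def softmax(x):
    x = np.asarray(x, dtype=float)
    e = np.exp(x - x.max())
    return e / e.sum()

def distill_kl(teacher_scores, student_scores):
    # Listwise distillation: KL(teacher || student) over the softmax
    # distribution of candidate-passage scores for one query.
    p, q = softmax(teacher_scores), softmax(student_scores)
    return float(np.sum(p * (np.log(p) - np.log(q))))
```

In the paper's self on-the-fly setting, the late-interaction scores and the dual-encoder scores come from the same shared encoder and are trained jointly, so the "teacher" signal is produced on the fly rather than by a separately pre-trained model.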
- Yuxiang Lu (26 papers)
- Yiding Liu (30 papers)
- Jiaxiang Liu (39 papers)
- Yunsheng Shi (5 papers)
- Zhengjie Huang (25 papers)
- Shikun Feng
- Yu Sun
- Hao Tian (146 papers)
- Hua Wu (191 papers)
- Shuaiqiang Wang (68 papers)
- Dawei Yin (165 papers)
- Haifeng Wang (194 papers)