BERT-EMD: Many-to-Many Layer Mapping for BERT Compression with Earth Mover's Distance (2010.06133v1)

Published 13 Oct 2020 in cs.CL and cs.LG

Abstract: Pre-trained language models (e.g., BERT) have achieved significant success in various NLP tasks. However, high storage and computational costs prevent pre-trained language models from being effectively deployed on resource-constrained devices. In this paper, we propose a novel BERT distillation method based on many-to-many layer mapping, which allows each intermediate student layer to learn from any intermediate teacher layer. In this way, our model can learn from different teacher layers adaptively for various NLP tasks, motivated by the intuition that different NLP tasks require different levels of the linguistic knowledge contained in the intermediate layers of BERT. In addition, we leverage Earth Mover's Distance (EMD) to compute the minimum cumulative cost that must be paid to transfer knowledge from the teacher network to the student network. EMD enables effective matching for many-to-many layer mapping, can be applied to network layers of different sizes, and measures the semantic distance between the teacher and student networks. Furthermore, we propose a cost attention mechanism that learns the layer weights used in EMD automatically, which further improves the model's performance and accelerates convergence. Extensive experiments on the GLUE benchmark demonstrate that our model achieves competitive performance compared to strong competitors in terms of both accuracy and model compression.
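To make the layer-mapping idea concrete, below is a minimal sketch of an EMD-based distillation loss. It is an illustration under stated assumptions, not the authors' implementation: the per-pair cost is taken to be the MSE between student and teacher hidden states (which presumes matching hidden sizes; a learned projection would be needed otherwise), the layer weights are uniform (the paper's cost attention mechanism learns them instead), and the function names (emd_flow, layerwise_emd_loss) are hypothetical.

```python
# Minimal sketch of an EMD-based many-to-many layer-mapping loss.
# Assumptions (not from the paper's code): MSE as the per-pair cost,
# uniform layer weights, matching student/teacher hidden sizes.
import numpy as np
import torch.nn.functional as F
from scipy.optimize import linprog


def emd_flow(cost, w_student, w_teacher):
    """Solve the transportation problem min <flow, cost> subject to
    row sums equal to w_student and column sums equal to w_teacher."""
    m, n = cost.shape
    A_eq, b_eq = [], []
    for i in range(m):                 # student layer i ships w_student[i]
        row = np.zeros(m * n)
        row[i * n:(i + 1) * n] = 1.0
        A_eq.append(row)
        b_eq.append(w_student[i])
    for j in range(n):                 # teacher layer j receives w_teacher[j]
        col = np.zeros(m * n)
        col[j::n] = 1.0
        A_eq.append(col)
        b_eq.append(w_teacher[j])
    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.x.reshape(m, n)


def layerwise_emd_loss(student_hiddens, teacher_hiddens):
    """student_hiddens: M tensors, teacher_hiddens: N tensors, each of
    shape (batch, seq_len, hidden). Returns a differentiable loss."""
    m, n = len(student_hiddens), len(teacher_hiddens)
    # Pairwise transfer costs between every student and teacher layer.
    pair_losses = [[F.mse_loss(s, t) for t in teacher_hiddens]
                   for s in student_hiddens]
    cost = np.array([[l.item() for l in row] for row in pair_losses])
    # Uniform layer weights; the paper's cost attention would learn these.
    w_s = np.full(m, 1.0 / m)
    w_t = np.full(n, 1.0 / n)
    flow = emd_flow(cost, w_s, w_t)    # solved on detached costs
    # EMD loss: the optimal flow re-weights the differentiable MSE terms.
    return sum(flow[i, j] * pair_losses[i][j]
               for i in range(m) for j in range(n))
```

Solving the transport problem on detached costs and then re-weighting the differentiable per-pair losses keeps the objective trainable end to end while treating the flow as fixed within each optimization step.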

Authors (6)
  1. Jianquan Li (18 papers)
  2. Xiaokang Liu (28 papers)
  3. Honghong Zhao (4 papers)
  4. Ruifeng Xu (66 papers)
  5. Min Yang (239 papers)
  6. Yaohong Jin (2 papers)
Citations (51)

GitHub