
LightMBERT: A Simple Yet Effective Method for Multilingual BERT Distillation (2103.06418v1)

Published 11 Mar 2021 in cs.CL

Abstract: Multilingual pre-trained language models (e.g., mBERT, XLM, and XLM-R) have shown impressive performance on cross-lingual natural language understanding tasks. However, these models are computationally intensive and difficult to deploy on resource-restricted devices. In this paper, we propose a simple yet effective distillation method (LightMBERT) for transferring the cross-lingual generalization ability of multilingual BERT to a small student model. The experimental results empirically demonstrate the efficiency and effectiveness of LightMBERT, which is significantly better than the baselines and performs comparably to the teacher mBERT.
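
The abstract describes the approach only at a high level. As a rough illustration of how layer-wise teacher-to-student distillation of this kind is commonly implemented, the sketch below matches a few student hidden states against selected teacher hidden states with an MSE objective. The layer mapping, tensor shapes, and loss choice here are assumptions for illustration, not LightMBERT's exact recipe.

```python
import torch
import torch.nn.functional as F

def hidden_state_distillation_loss(student_hidden, teacher_hidden, layer_map):
    """MSE between each student layer and its mapped teacher layer.

    student_hidden / teacher_hidden: lists of [batch, seq, hidden] tensors.
    layer_map: dict mapping student layer index -> teacher layer index (assumed mapping).
    """
    loss = torch.zeros(())
    for s_idx, t_idx in layer_map.items():
        loss = loss + F.mse_loss(student_hidden[s_idx], teacher_hidden[t_idx])
    return loss

# Toy example: a 12-layer teacher distilled into a 3-layer student,
# with random tensors standing in for real model activations.
batch, seq, hidden = 2, 16, 768
teacher_hidden = [torch.randn(batch, seq, hidden) for _ in range(12)]
student_hidden = [torch.randn(batch, seq, hidden, requires_grad=True) for _ in range(3)]

loss = hidden_state_distillation_loss(
    student_hidden, teacher_hidden, layer_map={0: 3, 1: 7, 2: 11}
)
loss.backward()
```

In a full training loop, this loss would be computed on teacher and student forward passes over the same batch and combined with any task or prediction-layer objective the authors use.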

Authors (8)
  1. Xiaoqi Jiao (8 papers)
  2. Yichun Yin (27 papers)
  3. Lifeng Shang (90 papers)
  4. Xin Jiang (242 papers)
  5. Xiao Chen (277 papers)
  6. Linlin Li (31 papers)
  7. Fang Wang (116 papers)
  8. Qun Liu (230 papers)
Citations (9)