
Variator: Accelerating Pre-trained Models with Plug-and-Play Compression Modules (2310.15724v2)

Published 24 Oct 2023 in cs.CL

Abstract: Pre-trained language models (PLMs) have achieved remarkable results on NLP tasks but at the expense of huge parameter sizes and the consequent computational costs. In this paper, we propose Variator, a parameter-efficient acceleration method that enhances computational efficiency through plug-and-play compression plugins. Compression plugins are designed to reduce the sequence length via compressing multiple hidden vectors into one and trained with original PLMs frozen. Different from traditional model acceleration methods, which compress PLMs to smaller sizes, Variator offers two distinct advantages: (1) In real-world applications, the plug-and-play nature of our compression plugins enables dynamic selection of different compression plugins with varying acceleration ratios based on the current workload. (2) The compression plugin comprises a few compact neural network layers with minimal parameters, significantly saving storage and memory overhead, particularly in scenarios with a growing number of tasks. We validate the effectiveness of Variator on seven datasets. Experimental results show that Variator can save 53% computational costs using only 0.9% additional parameters with a performance drop of less than 2%. Moreover, when the model scales to billions of parameters, Variator matches the strong performance of uncompressed PLMs.
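The core mechanism the abstract describes — shortening the sequence by merging groups of hidden vectors into one with a small learned module, while the backbone stays frozen — can be sketched as below. This is a minimal illustration under assumed details, not the authors' implementation: the group size, the concatenate-then-project design, and all names are hypothetical.

```python
import numpy as np

def compression_plugin(hidden, W, b, group_size=2):
    """Hypothetical sketch of a Variator-style compression plugin:
    merge every `group_size` consecutive hidden vectors into one
    via a small learned projection, halving the sequence length
    when group_size == 2."""
    seq_len, d = hidden.shape
    # Pad the sequence so its length is divisible by group_size.
    pad = (-seq_len) % group_size
    if pad:
        hidden = np.vstack([hidden, np.zeros((pad, d))])
    # Concatenate each group of hidden vectors into one long vector...
    groups = hidden.reshape(-1, group_size * d)
    # ...and project back to dimension d: one output vector per group.
    return groups @ W + b

rng = np.random.default_rng(0)
d, group_size = 8, 2
# Plugin parameters: only (group_size*d*d + d) weights, tiny next to a PLM.
W = rng.standard_normal((group_size * d, d)) * 0.1
b = np.zeros(d)

h = rng.standard_normal((10, d))            # 10 hidden vectors of width 8
compressed = compression_plugin(h, W, b, group_size)
print(compressed.shape)                     # sequence shortened from 10 to 5
```

In this reading, downstream layers then attend over the shorter compressed sequence, which is where the computational savings come from; only the plugin's few parameters are trained per task.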

Authors (11)
  1. Chaojun Xiao (39 papers)
  2. Yuqi Luo (2 papers)
  3. Wenbin Zhang (71 papers)
  4. Pengle Zhang (7 papers)
  5. Xu Han (270 papers)
  6. Yankai Lin (125 papers)
  7. Zhengyan Zhang (46 papers)
  8. Ruobing Xie (97 papers)
  9. Zhiyuan Liu (433 papers)
  10. Maosong Sun (337 papers)
  11. Jie Zhou (687 papers)
