MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators (2110.06609v2)

Published 13 Oct 2021 in cs.CL

Abstract: Prompting has recently been shown to be a promising approach for applying pre-trained language models to downstream tasks. We present Multi-Stage Prompting (MSP), a simple and automatic approach for leveraging pre-trained language models for translation tasks. To better mitigate the discrepancy between pre-training and translation, MSP divides the translation process via pre-trained language models into three separate stages: the encoding stage, the re-encoding stage, and the decoding stage. During each stage, we independently apply different continuous prompts to allow pre-trained language models to better shift to translation tasks. We conduct extensive experiments on three translation tasks, which show that our method significantly improves the translation performance of pre-trained language models.
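
The three-stage design described in the abstract can be made concrete with a short sketch. The following is a minimal illustration under stated assumptions, not the paper's implementation: `DummyLM`, all dimensions, and the class and method names are hypothetical, and the trainable embedding and output layers are illustrative scaffolding. The core idea shown is that the pre-trained model stays frozen while a separate trainable continuous prompt is prepended at each of the encoding, re-encoding, and decoding stages.

```python
# Minimal sketch of Multi-Stage Prompting (MSP). Assumes a frozen
# decoder-style LM exposed as a module mapping embeddings to hidden states.
# DummyLM, the dimensions, and all names here are illustrative assumptions.
import torch
import torch.nn as nn

class DummyLM(nn.Module):
    """Stand-in for a frozen pre-trained language model."""
    def __init__(self, d_model=64):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, embeds):
        return self.encoder(embeds)

class MSPTranslator(nn.Module):
    def __init__(self, lm, d_model=64, prompt_len=8, vocab=1000):
        super().__init__()
        self.lm = lm
        for p in self.lm.parameters():   # LM is frozen; only prompts train
            p.requires_grad = False
        self.embed = nn.Embedding(vocab, d_model)
        # One independent continuous prompt per stage, as in the abstract.
        self.prompts = nn.ParameterDict({
            s: nn.Parameter(torch.randn(prompt_len, d_model) * 0.02)
            for s in ("encode", "reencode", "decode")
        })
        self.out = nn.Linear(d_model, vocab)

    def run_stage(self, stage, embeds):
        # Prepend the stage-specific prompt, run the frozen LM,
        # then drop the prompt positions from the output.
        p = self.prompts[stage].unsqueeze(0).expand(embeds.size(0), -1, -1)
        h = self.lm(torch.cat([p, embeds], dim=1))
        return h[:, p.size(1):]

    def forward(self, src_ids, tgt_ids):
        src = self.embed(src_ids)
        h_enc = self.run_stage("encode", src)      # stage 1: encode source
        h_re = self.run_stage("reencode", h_enc)   # stage 2: refine source states
        tgt = self.embed(tgt_ids)
        # stage 3: decode target conditioned on refined source states
        h_dec = self.run_stage("decode", torch.cat([h_re, tgt], dim=1))
        return self.out(h_dec[:, h_re.size(1):])   # logits over target positions

model = MSPTranslator(DummyLM())
logits = model(torch.randint(0, 1000, (2, 10)), torch.randint(0, 1000, (2, 7)))
print(logits.shape)  # torch.Size([2, 7, 1000])
```

In this sketch only the prompt parameters (plus the illustrative embedding and output layers) receive gradients, which reflects the paper's motivation: adapting a frozen pre-trained model to translation without fine-tuning its weights.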

Citations (47)
