ST$^2$: Small-data Text Style Transfer via Multi-task Meta-Learning (2004.11742v1)

Published 24 Apr 2020 in cs.CL

Abstract: Text style transfer aims to paraphrase a sentence in one style into another style while preserving content. Due to the lack of parallel training data, state-of-the-art methods are unsupervised and rely on large datasets that share content. Furthermore, existing methods have been applied only to a very limited set of style categories, such as positive/negative and formal/informal. In this work, we develop a meta-learning framework to transfer between any kinds of text styles, including personal writing styles that are more fine-grained, share less content, and have much smaller training data. While state-of-the-art models fail at the few-shot style transfer task, our framework effectively utilizes information from other styles to improve both language fluency and style transfer accuracy.
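
The abstract describes a multi-task meta-learning setup for few-shot style transfer but gives no training details. For intuition only, the sketch below shows a generic first-order meta-learning loop (Reptile-style, not the authors' actual ST$^2$ method): an inner loop adapts a copy of the shared parameters to one few-shot style task, and an outer update nudges the shared parameters toward the adapted weights. The toy model, loss, synthetic data, and hyperparameters are all placeholder assumptions.

```python
import copy
import torch
import torch.nn as nn

# Hypothetical stand-in for a style-transfer model; the real ST^2
# architecture, losses, and style data are not specified here.
model = nn.Linear(16, 16)  # shared meta-parameters

def task_loss(net, batch):
    src, tgt = batch  # one few-shot style pair (placeholder loss)
    return nn.functional.mse_loss(net(src), tgt)

def sample_style_tasks(num_tasks=4, shots=8):
    # Each "task" is a tiny dataset for one style pair (synthetic).
    for _ in range(num_tasks):
        src = torch.randn(shots, 16)
        yield src, torch.tanh(src)

meta_lr, inner_lr, inner_steps = 0.1, 0.01, 5
for epoch in range(100):
    for batch in sample_style_tasks():
        # Inner loop: adapt a copy of the meta-parameters to this task.
        fast = copy.deepcopy(model)
        opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            opt.zero_grad()
            task_loss(fast, batch).backward()
            opt.step()
        # Outer (first-order, Reptile-style) update: move the shared
        # meta-parameters toward the task-adapted weights.
        with torch.no_grad():
            for meta_p, fast_p in zip(model.parameters(), fast.parameters()):
                meta_p += meta_lr * (fast_p - meta_p)
```

The point of this structure is the one the abstract makes: tasks with little data and little shared content can still inform each other, because only the shared parameters are trained across tasks while per-task adaptation stays cheap and few-shot.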

Authors (2)
  1. Xiwen Chen (45 papers)
  2. Kenny Q. Zhu (50 papers)
Citations (5)
