TinyStyler: Efficient Few-Shot Text Style Transfer with Authorship Embeddings (2406.15586v2)
Abstract: The goal of text style transfer is to transform the style of texts while preserving their original meaning, often with only a few examples of the target style. Existing style transfer methods generally rely on the few-shot capabilities of large language models (LLMs) or on complex controllable text generation approaches that are inefficient and underperform on fluency metrics. We introduce TinyStyler, a lightweight but effective approach that leverages a small language model (800M parameters) and pre-trained authorship embeddings to perform efficient, few-shot text style transfer. We evaluate on the challenging task of authorship style transfer and find that TinyStyler outperforms strong approaches such as GPT-4. We also evaluate TinyStyler's ability to perform text attribute style transfer (formal $\leftrightarrow$ informal) with automatic and human evaluations and find that the approach outperforms recent controllable text generation methods. Our model has been made publicly available at https://huggingface.co/tinystyler/tinystyler.
- Zachary Horvitz (5 papers)
- Ajay Patel (17 papers)
- Kanishk Singh (4 papers)
- Chris Callison-Burch (102 papers)
- Kathleen McKeown (85 papers)
- Zhou Yu (206 papers)
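
The abstract points to a released checkpoint on the Hugging Face Hub. As a hedged illustration only, the sketch below loads that checkpoint with the standard `transformers` seq2seq classes; this interface is an assumption, not something documented in the abstract, and the real model additionally conditions on pre-trained authorship embeddings, so practical usage should follow the authors' code on the model page.

```python
# Hypothetical usage sketch -- NOT the authors' documented API.
# Assumes the tinystyler/tinystyler checkpoint can be loaded with the
# standard Hugging Face `transformers` seq2seq classes; the actual model
# also conditions on pre-trained authorship (style) embeddings, which
# this sketch omits.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "tinystyler/tinystyler"  # checkpoint named in the abstract
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Example informal input for a formal <-> informal transfer scenario.
source_text = "hey, can u send me that report asap?"
inputs = tokenizer(source_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```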