Domain Adaptive Text Style Transfer (1908.09395v1)

Published 25 Aug 2019 in cs.CL

Abstract: Text style transfer without parallel data has achieved some practical success. However, in the scenario where less data is available, these methods may yield poor performance. In this paper, we examine domain adaptation for text style transfer to leverage massively available data from other domains. These data may demonstrate domain shift, which impedes the benefits of utilizing such data for training. To address this challenge, we propose simple yet effective domain adaptive text style transfer models, enabling domain-adaptive information exchange. The proposed models presumably learn from the source domain to: (i) distinguish stylized information and generic content information; (ii) maximally preserve content information; and (iii) adaptively transfer the styles in a domain-aware manner. We evaluate the proposed models on two style transfer tasks (sentiment and formality) over multiple target domains where only limited non-parallel data is available. Extensive experiments demonstrate the effectiveness of the proposed model compared to the baselines.
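
The abstract describes a disentanglement-style setup: separate a sentence's generic content from its stylized information, then regenerate with a target style in a domain-aware way. Below is a minimal, hypothetical PyTorch sketch of that general idea, conditioning a decoder on a content vector plus learned style and domain embeddings. The class name, dimensions, and conditioning scheme are illustrative assumptions, not the paper's actual model.

```python
# Illustrative sketch only (not the paper's architecture): a GRU encoder-decoder
# that pairs a content representation with learned style and domain embeddings,
# so the style can be swapped at decode time in a domain-aware fashion.
import torch
import torch.nn as nn

class StyleTransferSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256,
                 n_styles=2, n_domains=2, style_dim=32, domain_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder produces a (nominally) style-agnostic content representation.
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Style and domain enter as learned embeddings, enabling
        # domain-aware style control at generation time.
        self.style_emb = nn.Embedding(n_styles, style_dim)
        self.domain_emb = nn.Embedding(n_domains, domain_dim)
        self.bridge = nn.Linear(hid_dim + style_dim + domain_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_tokens, tgt_tokens, style_id, domain_id):
        _, h = self.encoder(self.embed(src_tokens))      # (1, B, H) content state
        cond = torch.cat([h.squeeze(0),
                          self.style_emb(style_id),
                          self.domain_emb(domain_id)], dim=-1)
        h0 = torch.tanh(self.bridge(cond)).unsqueeze(0)  # conditioned decoder init
        dec_out, _ = self.decoder(self.embed(tgt_tokens), h0)
        return self.out(dec_out)                         # (B, T, vocab) logits

# Usage: decode the same content under a chosen target style and domain.
model = StyleTransferSketch(vocab_size=10000)
src = torch.randint(0, 10000, (4, 12))  # batch of source sentences
tgt = torch.randint(0, 10000, (4, 12))  # decoder inputs (teacher forcing)
logits = model(src, tgt,
               style_id=torch.ones(4, dtype=torch.long),    # target style
               domain_id=torch.zeros(4, dtype=torch.long))  # target domain
print(logits.shape)  # torch.Size([4, 12, 10000])
```

In practice, a setup like this would also need adversarial or reconstruction losses to actually disentangle style from content; the sketch only shows the conditioning mechanism.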

Authors (7)
  1. Dianqi Li (18 papers)
  2. Yizhe Zhang (127 papers)
  3. Zhe Gan (135 papers)
  4. Yu Cheng (354 papers)
  5. Chris Brockett (37 papers)
  6. Ming-Ting Sun (16 papers)
  7. Bill Dolan (45 papers)
Citations (51)