CoDa: Constrained Generation based Data Augmentation for Low-Resource NLP (2404.00415v1)

Published 30 Mar 2024 in cs.CL

Abstract: We present CoDa (Constrained Generation based Data Augmentation), a controllable, effective, and training-free data augmentation technique for low-resource (data-scarce) NLP. Our approach is based on prompting off-the-shelf instruction-following LLMs to generate text that satisfies a set of constraints. Specifically, we extract a set of simple constraints from every instance in the low-resource dataset and verbalize them to prompt an LLM to generate novel and diverse training instances. Our findings reveal that synthetic data that follows simple constraints in the downstream dataset acts as a highly effective augmentation, and CoDa achieves this without intricate decoding-time constrained generation techniques or fine-tuning with complex algorithms that eventually bias the model toward the small number of training instances. Additionally, CoDa is the first framework that provides users with explicit control over the augmentation generation process, thereby also allowing easy adaptation to several domains. We demonstrate the effectiveness of CoDa across 11 datasets spanning 3 tasks and 3 low-resource settings. CoDa outperforms all our baselines, qualitatively and quantitatively, with improvements of 0.12%-7.19%. Code is available here: https://github.com/Sreyan88/CoDa
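
The abstract describes a three-step pipeline: extract simple constraints from each training instance, verbalize them into an instruction prompt, and ask an off-the-shelf LLM to generate a fresh instance satisfying them. Below is a minimal Python sketch of that idea; the specific constraint choices (label, approximate length, salient keywords) and the `call_llm` stub are illustrative assumptions, not the authors' exact implementation from the linked repository.

```python
# Illustrative sketch of constraint-based augmentation in the spirit of CoDa.
# Assumptions: the constraint set, prompt wording, and `call_llm` are
# placeholders; see https://github.com/Sreyan88/CoDa for the real code.

def extract_constraints(text: str, label: str) -> list[str]:
    """Derive simple, verbalizable constraints from one training instance."""
    words = text.split()
    keywords = [w for w in words if len(w) > 6][:3]  # crude salient-word proxy
    constraints = [
        f"The text must express the label '{label}'.",
        f"The text should be roughly {len(words)} words long.",
    ]
    if keywords:
        constraints.append(f"The text should mention: {', '.join(keywords)}.")
    return constraints

def verbalize_prompt(constraints: list[str]) -> str:
    """Turn extracted constraints into an instruction prompt for an LLM."""
    bullet_list = "\n".join(f"- {c}" for c in constraints)
    return (
        "Generate one new, diverse training example that satisfies all of "
        f"the following constraints:\n{bullet_list}\nExample:"
    )

def call_llm(prompt: str) -> str:
    # Placeholder: plug in any instruction-following LLM client here.
    raise NotImplementedError

if __name__ == "__main__":
    instance = ("The film's pacing dragged terribly in the second act.", "negative")
    prompt = verbalize_prompt(extract_constraints(*instance))
    print(prompt)  # pass to call_llm(prompt) to obtain a synthetic augmentation
```

Because generation is steered purely through the prompt, no model fine-tuning or decoding-time constraint machinery is needed, which is what makes the approach training-free and easy to port across domains.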

Authors (6)
  1. Chandra Kiran Reddy Evuru (9 papers)
  2. Sreyan Ghosh (46 papers)
  3. Sonal Kumar (30 papers)
  4. Ramaneswaran S (6 papers)
  5. Utkarsh Tyagi (18 papers)
  6. Dinesh Manocha (366 papers)
Citations (3)