
Unsupervised Dual Paraphrasing for Two-stage Semantic Parsing (2005.13485v3)

Published 27 May 2020 in cs.CL and cs.LG

Abstract: A daunting problem for semantic parsing is the scarcity of annotated data. Aiming to reduce nontrivial human labor, we propose a two-stage semantic parsing framework in which the first stage uses an unsupervised paraphrase model to convert an unlabeled natural language utterance into a canonical utterance. A downstream naive semantic parser then accepts this intermediate output and returns the target logical form. The entire training process is split into two phases, pre-training and cycle learning, and three tailored self-supervised tasks are introduced throughout training to activate the unsupervised paraphrase model. Experimental results on the Overnight and GeoGranno benchmarks demonstrate that our framework is effective and compatible with supervised training.
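
To make the two-stage decomposition concrete, the sketch below wires a stand-in paraphrase model to a toy canonical-utterance parser. Every class name, canonical phrasing, and logical-form template here is a hypothetical illustration of the pipeline shape only, not the authors' implementation, the Overnight grammar, or their training procedure.

```python
# Minimal sketch of the two-stage pipeline described in the abstract.
# All names, canonical utterances, and logical forms are hypothetical.

from typing import Dict


class ParaphraseModel:
    """Stage 1 (hypothetical stand-in): rewrite a natural-language utterance
    into a canonical utterance drawn from a restricted, grammar-friendly
    sublanguage. In the paper this is an unsupervised paraphrase model trained
    with pre-training plus cycle learning; here it is faked with a lookup table."""

    def __init__(self) -> None:
        self._examples: Dict[str, str] = {
            "which articles came out after 2004":
                "article whose publication date is larger than 2004",
        }

    def paraphrase(self, utterance: str) -> str:
        return self._examples.get(utterance.lower(), utterance)


class NaiveSemanticParser:
    """Stage 2 (hypothetical stand-in): map canonical utterances to logical
    forms. Because canonical utterances follow a small grammar, a simple
    rule-based or seq2seq parser trained on synthesized pairs suffices."""

    def parse(self, canonical: str) -> str:
        if "publication date is larger than" in canonical:
            year = canonical.rsplit(" ", 1)[-1]
            return f"(filter (type article) (> publication_date {year}))"
        raise ValueError(f"no rule for canonical utterance: {canonical!r}")


def two_stage_parse(utterance: str) -> str:
    """Full pipeline: NL utterance -> canonical utterance -> logical form."""
    canonical = ParaphraseModel().paraphrase(utterance)
    return NaiveSemanticParser().parse(canonical)


if __name__ == "__main__":
    print(two_stage_parse("Which articles came out after 2004"))
    # (filter (type article) (> publication_date 2004))
```

The point of the intermediate canonical utterance is that only the first stage has to cope with the variability of natural language; the second stage sees a constrained sublanguage and can therefore be trained without human-annotated logical forms.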

Authors (8)
  1. Ruisheng Cao (24 papers)
  2. Su Zhu (29 papers)
  3. Chenyu Yang (20 papers)
  4. Chen Liu (206 papers)
  5. Rao Ma (22 papers)
  6. Yanbin Zhao (14 papers)
  7. Lu Chen (244 papers)
  8. Kai Yu (201 papers)
Citations (46)