A Globally Normalized Neural Model for Semantic Parsing (2106.03376v1)
Published 7 Jun 2021 in cs.CL
Abstract: In this paper, we propose a globally normalized model for context-free grammar (CFG)-based semantic parsing. Instead of predicting a probability, our model predicts a real-valued score at each step and does not suffer from the label bias problem. Experiments show that our approach outperforms locally normalized models on small datasets, but it does not yield improvement on a large dataset.
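To make the contrast concrete, here is a toy sketch (not the paper's model; all scores and the two-step action space are made up) of the difference between local and global normalization. With history-dependent scores, a locally normalized model fixes the probability mass of each prefix before later evidence arrives, which is the label bias problem; a globally normalized model applies one softmax over the total score of each complete derivation, so strong later scores can reweight early choices.

```python
import math

# Hypothetical history-dependent per-step scores for a two-step,
# two-action derivation. All numbers are illustrative.
STEP_SCORES = {
    (): {"A": 1.0, "B": 1.0},      # scores at step 0
    ("A",): {"A": 3.0, "B": 0.0},  # scores at step 1 after choosing A
    ("B",): {"A": 0.0, "B": 0.1},  # scores at step 1 after choosing B
}
SEQS = [("A", "A"), ("A", "B"), ("B", "A"), ("B", "B")]

def softmax(d):
    z = sum(math.exp(v) for v in d.values())
    return {k: math.exp(v) / z for k, v in d.items()}

def local_prob(seq):
    """Locally normalized: product of per-step softmax probabilities.
    Each prefix's outgoing mass must sum to 1, so step-0 choices are
    fixed before later scores are seen (label bias)."""
    p = 1.0
    for t in range(len(seq)):
        p *= softmax(STEP_SCORES[seq[:t]])[seq[t]]
    return p

def global_prob(seq):
    """Globally normalized: a single softmax over the total real-valued
    score of each complete sequence, so later evidence can reweight
    earlier decisions."""
    total = lambda s: sum(STEP_SCORES[s[:t]][s[t]] for t in range(len(s)))
    z = sum(math.exp(total(s)) for s in SEQS)
    return math.exp(total(seq)) / z
```

With these toy numbers, the locally normalized model caps the sequence ("A", "A") at roughly 0.48, because prefix "A" received only half the mass at step 0, while the globally normalized model assigns it roughly 0.87 thanks to the strong step-1 score.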
- Chenyang Huang (25 papers)
- Wei Yang (349 papers)
- Yanshuai Cao (30 papers)
- Lili Mou (79 papers)
- Osmar Zaïane (4 papers)