
On the Challenges of Fully Incremental Neural Dependency Parsing (2309.16254v1)

Published 28 Sep 2023 in cs.CL

Abstract: Since the popularization of BiLSTMs and Transformer-based bidirectional encoders, state-of-the-art syntactic parsers have lacked incrementality, requiring access to the whole sentence and deviating from human language processing. This paper explores whether fully incremental dependency parsing with modern architectures can be competitive. We build parsers combining strictly left-to-right neural encoders with fully incremental sequence-labeling and transition-based decoders. The results show that fully incremental parsing with modern architectures considerably lags behind bidirectional parsing, highlighting the challenges of psycholinguistically plausible parsing.
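The abstract describes parsers that pair strictly left-to-right encoders with fully incremental decoders. As a rough illustration of the transition-based side only, the sketch below implements a minimal arc-standard parsing loop: the configuration (stack, buffer, arcs) is updated one action at a time, so each decision can depend only on the prefix consumed so far. The `greedy_right` policy is a hypothetical toy stand-in for a neural scorer; nothing here reproduces the paper's actual models.

```python
def parse_incremental(tokens, choose):
    """Arc-standard transition loop over `tokens` (index 0 is an artificial
    ROOT). `choose(stack, buffer)` returns "SHIFT", "LEFT", or "RIGHT",
    seeing only the current configuration, never future words' scores."""
    stack = [0]                                  # start with ROOT on the stack
    buffer = list(range(1, len(tokens) + 1))     # 1-based word indices
    arcs = []                                    # (head, dependent) pairs
    while buffer or len(stack) > 1:
        action = choose(stack, buffer)
        if action == "SHIFT" and buffer:
            stack.append(buffer.pop(0))          # consume the next word
        elif action == "LEFT" and len(stack) > 2:
            dep = stack.pop(-2)                  # top heads second-from-top
            arcs.append((stack[-1], dep))
        elif action == "RIGHT" and len(stack) > 1:
            dep = stack.pop()                    # second-from-top heads top
            arcs.append((stack[-1], dep))
        else:
            raise ValueError(f"invalid action {action!r}")
    return arcs


def greedy_right(stack, buffer):
    """Toy policy (not a trained model): shift a word, then immediately
    attach it rightward to the word below it on the stack."""
    if buffer and len(stack) < 3:
        return "SHIFT"
    return "RIGHT"
```

For example, `parse_incremental(["John", "saw", "Mary"], greedy_right)` produces one head per word, with the first word attached to ROOT (index 0). In a neural instantiation, `choose` would score actions from a left-to-right encoder state, which is exactly where the paper locates the gap versus bidirectional parsers: no future context is available when each arc is committed.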

