Rethinking Relation Classification with Graph Meaning Representations (2310.09772v2)
Abstract: In natural language understanding, the intersection of neural models and graph meaning representations (GMRs) remains a compelling area of research. Despite growing interest, a critical gap persists in understanding the exact influence of GMRs, particularly on relation extraction tasks. To address this, we introduce DAGNN-plus, a simple and parameter-efficient neural architecture that decouples contextual representation learning from structural information propagation. Combined with various sequence encoders and GMRs, this architecture provides a foundation for systematic experimentation on two English and two Chinese datasets. Our empirical analysis covers four graph formalisms and nine parsers. The results yield a nuanced understanding of GMRs: they improve performance on three of the four datasets, and benefit English more than Chinese, which we attribute to the higher accuracy of English parsers. Interestingly, GMRs appear less effective on literary-domain datasets than on general-domain ones. These findings lay the groundwork for better-informed design of GMRs and parsers for relation classification, which we expect to tangibly shape future research in natural language understanding.
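The key architectural idea in the abstract — decoupling contextual representation learning from structural information propagation — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's implementation: it presumes DAGNN-plus follows the decoupled design of DAGNN (parameter-free neighborhood averaging applied on top of pre-computed contextual features), and all function names and the toy graph below are illustrative.

```python
# Hedged sketch: contextual features are assumed to come from a separate
# sequence encoder (e.g. a pretrained transformer); here we just use toy
# vectors. Structure is then propagated over the GMR graph with NO extra
# parameters, which is what "decoupling" buys in parameter efficiency.

def normalize_adjacency(adj):
    """Row-normalize an adjacency matrix (self-loops included) so each
    non-empty row sums to 1."""
    norm = []
    for row in adj:
        s = sum(row)
        norm.append([v / s for v in row] if s else list(row))
    return norm

def propagate(features, adj, hops):
    """Parameter-free propagation: repeatedly average each node with its
    graph neighbors, keeping one feature matrix per hop (hop 0 is the
    contextual features themselves). A downstream classifier could then
    combine the hop-wise matrices, e.g. with learned attention."""
    a = normalize_adjacency(adj)
    layers = [features]
    h = features
    for _ in range(hops):
        h = [[sum(a[i][j] * h[j][d] for j in range(len(h)))
              for d in range(len(h[0]))]
             for i in range(len(h))]
        layers.append(h)
    return layers

# Toy example: 3 "tokens" with 2-dim contextual features on a chain graph.
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
adj = [[1, 1, 0],   # self-loops on the diagonal
       [1, 1, 1],
       [0, 1, 1]]
hops = propagate(feats, adj, 2)
print(len(hops))  # one feature matrix per hop: 0, 1, 2
```

Because propagation adds no trainable weights, swapping in a different graph formalism (UCCA, AMR, dependency trees, semantic dependencies) only changes `adj`, which is what makes systematic comparison across GMRs and parsers cheap.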