
IBERT: Idiom Cloze-style reading comprehension with Attention (2112.02994v1)

Published 5 Nov 2021 in cs.CL and cs.AI

Abstract: Idioms are special fixed phrases usually derived from stories. They are commonly used in casual conversation and literary writing, and their meanings are usually highly non-compositional. The idiom cloze task is a challenging problem in NLP research. Previous approaches to this task are built on sequence-to-sequence (Seq2Seq) models and achieve reasonably good performance on existing datasets. However, they fall short in understanding the highly non-compositional meaning of idiomatic expressions, and they do not consider the local and global context at the same time. In this paper, we propose a BERT-based embedding Seq2Seq model that encodes idiomatic expressions and considers them in both global and local context. Our model uses XLNet as the encoder and RoBERTa for choosing the most probable idiom for a given context. Experiments on the EPIE Static Corpus dataset show that our model outperforms existing state-of-the-art models.
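
To make the cloze setup concrete, below is a minimal sketch of the candidate-scoring step using an off-the-shelf RoBERTa masked language model and a pseudo-log-likelihood heuristic. This is an illustration of the general "choose the most probable idiom for a given context" idea, not the paper's actual IBERT architecture (which combines an XLNet encoder with RoBERTa in a Seq2Seq model); the context string, candidate list, and helper names are assumptions for the example.

```python
# Illustrative idiom-cloze scoring, NOT the IBERT model from the paper:
# fill the blank with each candidate idiom, compute RoBERTa's masked-LM
# pseudo-log-likelihood over the idiom's tokens, and pick the best score.
import torch
from transformers import RobertaTokenizer, RobertaForMaskedLM

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")
model.eval()

def find_subsequence(haystack, needle):
    """Return token positions of the first occurrence of `needle` in `haystack`."""
    for i in range(len(haystack) - len(needle) + 1):
        if haystack[i:i + len(needle)] == needle:
            return list(range(i, i + len(needle)))
    raise ValueError("idiom tokens not found in filled context")

def score_candidate(context: str, idiom: str) -> float:
    """Pseudo-log-likelihood of `idiom` in the <blank> slot: mask each
    idiom token in turn and sum RoBERTa's log-probability of the true token."""
    filled = context.replace("<blank>", idiom)
    input_ids = tokenizer(filled, return_tensors="pt")["input_ids"][0]
    # Leading space matters for RoBERTa's BPE vocabulary.
    idiom_ids = tokenizer(" " + idiom, add_special_tokens=False)["input_ids"]
    positions = find_subsequence(input_ids.tolist(), idiom_ids)

    total = 0.0
    for pos in positions:
        masked = input_ids.clone()
        masked[pos] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, pos]
        total += torch.log_softmax(logits, dim=-1)[input_ids[pos]].item()
    return total

# Hypothetical cloze instance: context with a blank plus candidate idioms.
context = "After the merger fell through, the whole plan was <blank>."
candidates = ["dead in the water", "a piece of cake", "over the moon"]
best = max(candidates, key=lambda c: score_candidate(context, c))
print(best)  # expected: "dead in the water"
```

Summing per-token log-probabilities favors idioms whose every token fits the surrounding context, which is a common baseline for cloze-style selection; the paper's contribution is to go beyond such local scoring by encoding the idiom jointly with both global and local context.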

Citations (8)
