
Ordered Memory Baselines

Published 8 Feb 2023 in cs.CL and cs.LG (arXiv:2302.06451v1)

Abstract: Natural language semantics can be modelled with phrase-structure grammars, which are naturally represented as trees. Accordingly, recent advances in natural language processing have been made using recursive neural networks equipped with memory models that allow them to infer tree-structured representations of an input sentence. These tree models have yielded improvements in sentiment analysis and semantic recognition. Here we review the Ordered Memory model proposed by Shen et al. (2019) at the NeurIPS 2019 conference, and attempt either to create baselines that perform better or to create simpler models that perform equally well. We found that the Ordered Memory model performs on par with the state-of-the-art models used in tree-structured modelling, and performs better than simplified baselines that require fewer parameters.
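For readers unfamiliar with tree-structured composition, the sketch below illustrates the general idea the paper builds on: a recursive neural network composes word vectors bottom-up along a binary parse tree. This is an illustrative assumption only, not the Ordered Memory architecture; the `compose` function, the embedding dimension, and the toy parse tree are all hypothetical.

```python
# Minimal sketch of recursive (tree-structured) composition over a binary
# parse tree. This is NOT the Ordered Memory model; it only illustrates the
# generic recursive-NN idea of building a sentence vector from a tree.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # hypothetical embedding size

# Hypothetical word embeddings and composition weights.
embed = {w: rng.normal(size=DIM) for w in ["the", "movie", "was", "great"]}
W = rng.normal(size=(DIM, 2 * DIM)) * 0.1
b = np.zeros(DIM)

def compose(left, right):
    """Combine two child vectors into a parent vector (single tanh layer)."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

def encode(tree):
    """Recursively encode a binary tree given as nested tuples of tokens."""
    if isinstance(tree, str):      # leaf: look up the word embedding
        return embed[tree]
    left, right = tree             # internal node: compose the two children
    return compose(encode(left), encode(right))

# Toy parse: ((the movie) (was great))
sentence_vec = encode((("the", "movie"), ("was", "great")))
print(sentence_vec.shape)  # (8,)
```

In models like Ordered Memory, the parse tree is not given in advance; the network instead induces the composition order from the raw token sequence, which is what the memory mechanism is for.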
