Dependency-based Mixture Language Models (2203.10256v1)

Published 19 Mar 2022 in cs.CL

Abstract: Various models have been proposed to incorporate knowledge of syntactic structures into neural language models. However, previous works have relied heavily on elaborate components tailored to a specific language model, usually a recurrent neural network (RNN), which makes them unwieldy in practice to fit into other neural language models, such as Transformer and GPT-2. In this paper, we introduce the Dependency-based Mixture Language Models. In detail, we first train neural language models with a novel dependency modeling objective to learn the probability distribution of future dependent tokens given the context. We then formulate the next-token probability by mixing the previous dependency modeling probability distributions with self-attention. Extensive experiments and human evaluations show that our method can be easily and effectively applied to different neural language models while improving neural text generation on various tasks.
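
As a rough illustration of the mixture step described in the abstract, the sketch below (not the authors' code; the function name, tensor shapes, and the causal-masking details are assumptions) forms the next-token distribution by attention-weighting the dependency-modeling distributions predicted at earlier context positions.

```python
# Minimal sketch of mixing per-position dependency-modeling distributions
# with self-attention weights to obtain next-token probabilities.
# All names and shapes here are illustrative assumptions, not the paper's code.
import torch
import torch.nn.functional as F

def mixture_next_token_probs(dep_logits, queries, keys):
    """
    dep_logits: (T, V) dependency-modeling logits at each context position,
                i.e. that position's distribution over its future dependent tokens.
    queries:    (T, d) query vectors for the current positions.
    keys:       (T, d) key vectors for the context positions.
    Returns:    (T, V) mixed next-token probability distributions.
    """
    d = queries.size(-1)

    # Per-position dependency distributions over the vocabulary.
    dep_probs = F.softmax(dep_logits, dim=-1)                        # (T, V)

    # Causal self-attention weights: position t attends only to j <= t.
    scores = queries @ keys.T / d ** 0.5                             # (T, T)
    causal_mask = torch.triu(torch.ones_like(scores), diagonal=1).bool()
    scores = scores.masked_fill(causal_mask, float("-inf"))
    attn = F.softmax(scores, dim=-1)                                 # (T, T)

    # Mixture: the next-token distribution at position t is the
    # attention-weighted average of the attended dependency distributions.
    return attn @ dep_probs                                          # (T, V)

# Toy usage with random tensors; each output row is a valid distribution.
T, d, vocab = 5, 16, 100
probs = mixture_next_token_probs(torch.randn(T, vocab),
                                 torch.randn(T, d),
                                 torch.randn(T, d))
assert torch.allclose(probs.sum(-1), torch.ones(T), atol=1e-5)
```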

Citations (1)