Tri-level Joint Natural Language Understanding for Multi-turn Conversational Datasets (2305.17729v1)

Published 28 May 2023 in cs.CL

Abstract: Natural language understanding typically maps single utterances to a dual-level semantic frame: sentence-level intent and word-level slot labels. The best performing models force explicit interaction between intent detection and slot filling. We present a novel tri-level joint natural language understanding approach, adding a domain level and explicitly exchanging semantic information between all levels. This approach enables the use of multi-turn datasets, which are a more natural conversational environment than single utterances. We evaluate our model on two multi-turn datasets for which we are the first to conduct joint slot filling and intent detection. Our model outperforms state-of-the-art joint models in slot filling and intent detection on multi-turn datasets. We provide an analysis of explicit interaction locations between the layers. We conclude that including domain information improves model performance.
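The tri-level interaction the abstract describes can be sketched as a shared encoder feeding three classifier heads, where the domain prediction conditions the intent head and both condition the per-token slot head. This is an illustrative shape-level sketch only, not the paper's architecture: the dimensions, the mean-pooling, and the use of random matrices in place of trained layers are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- illustrative only, not taken from the paper.
HIDDEN, N_DOMAINS, N_INTENTS, N_SLOTS, SEQ_LEN = 16, 3, 5, 8, 6

# Shared encoder output: one vector per token (stand-in for a trained
# BiLSTM/transformer encoder).
tokens = rng.normal(size=(SEQ_LEN, HIDDEN))

def head(in_dim, out_dim):
    """Random linear layer standing in for a trained classifier head."""
    return rng.normal(size=(in_dim, out_dim))

W_domain = head(HIDDEN, N_DOMAINS)
W_intent = head(HIDDEN + N_DOMAINS, N_INTENTS)            # intent sees domain logits
W_slot = head(HIDDEN + N_DOMAINS + N_INTENTS, N_SLOTS)    # slots see both levels

pooled = tokens.mean(axis=0)                              # utterance-level vector

# Level 1 (domain) -> level 2 (intent): explicit information exchange.
domain_logits = pooled @ W_domain
intent_logits = np.concatenate([pooled, domain_logits]) @ W_intent

# Level 3: per-token slot logits, conditioned on domain and intent outputs.
context = np.concatenate([domain_logits, intent_logits])
slot_logits = np.stack([np.concatenate([t, context]) @ W_slot for t in tokens])

print(domain_logits.shape, intent_logits.shape, slot_logits.shape)
# (3,) (5,) (6, 8)
```

The key point the sketch illustrates is the direction of information flow: higher-level (coarser) predictions are concatenated into the inputs of the finer-grained heads, so errors and evidence at the domain level can influence slot tagging.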

Authors (5)
  1. Henry Weld (4 papers)
  2. Sijia Hu (1 paper)
  3. Siqu Long (18 papers)
  4. Josiah Poon (41 papers)
  5. Soyeon Caren Han (48 papers)
Citations (1)