
Topic Spotting using Hierarchical Networks with Self Attention (1904.02815v1)

Published 4 Apr 2019 in cs.CL, cs.AI, and cs.LG

Abstract: The success of deep learning techniques has renewed interest in the development of dialogue systems. However, current systems struggle to hold consistent long-term conversations with users and fail to build rapport. Topic spotting, the task of automatically inferring the topic of a conversation, has been shown to make a dialog system more engaging and efficient. We propose a hierarchical model with self-attention for topic spotting. Experiments on the Switchboard corpus show the superior performance of our model over previously proposed techniques for topic spotting and over deep models for text classification. Additionally, in contrast to offline processing of dialog, we also analyze the performance of our model in a more realistic setting, i.e., an online setting where the topic is identified in real time as the dialog progresses. Results show that our model is able to generalize even with limited information in the online setting.
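The abstract describes a hierarchical architecture: word-level representations are pooled with self-attention into utterance vectors, which are in turn pooled with self-attention into a dialog vector used for topic classification. Below is a minimal numpy sketch of that two-level attention-pooling idea, not the paper's actual model; all parameter names, dimensions, and the use of a single learned attention vector per level are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    # H: (n, d) hidden states; w: (d,) hypothetical attention parameter.
    # Score each row, normalize, and return the attention-weighted sum.
    weights = softmax(H @ w)      # (n,)
    return weights @ H            # (d,)

rng = np.random.default_rng(0)
d = 8  # toy hidden dimension

# Toy dialog: 3 utterances, each a sequence of word vectors
# (stand-ins for encoder outputs, e.g. from an RNN over words).
dialog = [rng.normal(size=(5, d)),
          rng.normal(size=(4, d)),
          rng.normal(size=(6, d))]

w_word = rng.normal(size=d)       # word-level attention params (assumed)
w_utt = rng.normal(size=d)        # utterance-level attention params (assumed)
W_out = rng.normal(size=(d, 3))   # projection to 3 hypothetical topics

# Level 1: pool words into one vector per utterance.
utt_vecs = np.stack([attention_pool(U, w_word) for U in dialog])  # (3, d)

# Level 2: pool utterances into a single dialog vector.
dialog_vec = attention_pool(utt_vecs, w_utt)                      # (d,)

# Classify the dialog's topic.
topic_probs = softmax(dialog_vec @ W_out)
```

In an online setting, the same pipeline would be re-run (or incrementally updated) after each new utterance, so the topic distribution is refined as the dialog progresses.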

Authors (5)
  1. Pooja Chitkara (8 papers)
  2. Ashutosh Modi (60 papers)
  3. Pravalika Avvaru (2 papers)
  4. Sepehr Janghorbani (4 papers)
  5. Mubbasir Kapadia (37 papers)
Citations (8)