HAXMLNet: Hierarchical Attention Network for Extreme Multi-Label Text Classification (1904.12578v1)

Published 24 Mar 2019 in cs.IR, cs.LG, and stat.ML

Abstract: Extreme multi-label text classification (XMTC) addresses the problem of tagging each text with the most relevant labels from an extreme-scale label set. Traditional methods use bag-of-words (BOW) representations, which carry no context information, as their features. The state-of-the-art deep learning-based method, AttentionXML, which uses a recurrent neural network (RNN) with multi-label attention, can hardly deal with extreme-scale problems (hundreds of thousands of labels). To address this, we propose HAXMLNet, which uses an efficient and effective hierarchical structure with multi-label attention. Experimental results show that HAXMLNet reaches competitive performance with other state-of-the-art methods.

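The abstract builds on the multi-label attention mechanism of AttentionXML: each label has its own attention distribution over the RNN's token representations, producing a label-specific document vector that is then scored. The sketch below illustrates that core idea in PyTorch; it omits HAXMLNet's hierarchical label partitioning, and all hyperparameters (vocab_size, emb_dim, hidden_dim, num_labels) are illustrative assumptions rather than the paper's settings.

```python
# Minimal sketch of multi-label attention over BiLSTM outputs, in the spirit of
# AttentionXML / HAXMLNet. This is an assumed simplification: the hierarchical
# label-clustering component described in the paper is not shown.
import torch
import torch.nn as nn


class MultiLabelAttention(nn.Module):
    def __init__(self, vocab_size=50000, emb_dim=300, hidden_dim=256, num_labels=1000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        # One attention query vector per label: (num_labels, 2*hidden_dim).
        self.label_queries = nn.Parameter(torch.randn(num_labels, 2 * hidden_dim))
        # Shared scoring layers applied to each label-specific document vector.
        self.fc = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices.
        h, _ = self.rnn(self.embed(token_ids))        # (batch, seq_len, 2*hidden)
        # Each label attends over all token positions.
        scores = torch.einsum("bsh,lh->bls", h, self.label_queries)
        attn = torch.softmax(scores, dim=-1)          # (batch, labels, seq_len)
        # Label-specific document representations via attention pooling.
        docs = torch.einsum("bls,bsh->blh", attn, h)  # (batch, labels, 2*hidden)
        return self.fc(docs).squeeze(-1)              # (batch, labels) logits


# Usage: per-label probabilities come from a sigmoid over the logits, trained
# with binary cross-entropy.
logits = MultiLabelAttention()(torch.randint(1, 50000, (2, 128)))
probs = torch.sigmoid(logits)
```

In HAXMLNet this kind of attention is applied within a label hierarchy so that only a manageable subset of labels is scored at each level, which is what makes the extreme-scale setting tractable.
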
Authors (4)
  1. Ronghui You (4 papers)
  2. Zihan Zhang (121 papers)
  3. Suyang Dai (3 papers)
  4. Shanfeng Zhu (9 papers)
Citations (13)
