Joint Embedding of Words and Category Labels for Hierarchical Multi-label Text Classification (2004.02555v3)

Published 6 Apr 2020 in cs.NE, cs.CL, and cs.LG

Abstract: Text classification has become increasingly challenging as label granularity is continuously refined and the number of labels grows. To address this, some research has explored strategies that exploit the hierarchical structure of problems with a large number of categories. Hierarchical text classification (HTC) has therefore received extensive attention and has broad application prospects. Making full use of the relationship between parent and child categories can greatly improve classification performance. In this paper, we propose a joint embedding of text and parent category based on a hierarchical fine-tuning ordered neurons LSTM (HFT-ONLSTM) for HTC. Our method makes full use of the connection between upper-level and lower-level labels. Experiments show that our model outperforms state-of-the-art hierarchical models at a lower computational cost.
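The core idea described in the abstract, embedding the parent-category label in the same space as the words so that lower-level classification is conditioned on upper-level information, can be sketched roughly as follows. This is a minimal illustration, not the authors' HFT-ONLSTM: a plain `nn.LSTM` stands in for the ordered-neurons LSTM, the hierarchical fine-tuning procedure is omitted, and all layer sizes and label counts are placeholders.

```python
import torch
import torch.nn as nn

class JointEmbeddingHTC(nn.Module):
    """Sketch of joint text + parent-category embedding for hierarchical
    text classification. A plain nn.LSTM substitutes for the ON-LSTM used
    in the paper; dimensions and label counts are illustrative."""

    def __init__(self, vocab_size, num_parent_labels, num_child_labels,
                 embed_dim=128, hidden_dim=256):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        # Parent-category labels are embedded in the same space as words,
        # so the upper-level label can condition the lower-level classifier.
        self.label_embed = nn.Embedding(num_parent_labels, embed_dim)
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.child_head = nn.Linear(hidden_dim, num_child_labels)

    def forward(self, token_ids, parent_label_ids):
        # token_ids: (batch, seq_len); parent_label_ids: (batch,)
        words = self.word_embed(token_ids)                        # (B, T, E)
        parent = self.label_embed(parent_label_ids).unsqueeze(1)  # (B, 1, E)
        # Joint sequence: parent-category embedding prepended to the text.
        joint = torch.cat([parent, words], dim=1)                 # (B, T+1, E)
        _, (h_n, _) = self.encoder(joint)
        return self.child_head(h_n[-1])                           # child logits
```

For example, `JointEmbeddingHTC(vocab_size=30000, num_parent_labels=7, num_child_labels=50)` applied to a batch of token IDs and parent-label IDs returns child-level logits; in the paper's setup the parent labels would come from the upper-level classifier rather than being given.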

Authors (2)
  1. Jingpeng Zhao (2 papers)
  2. Yinglong Ma (9 papers)
Citations (1)
