HPT: Hierarchy-aware Prompt Tuning for Hierarchical Text Classification (2204.13413v2)

Published 28 Apr 2022 in cs.CL

Abstract: Hierarchical text classification (HTC) is a challenging subtask of multi-label classification due to its complex label hierarchy. Recently, pretrained language models (PLMs) have been widely adopted for HTC through a fine-tuning paradigm. However, in this paradigm there is a large gap between classification tasks with a sophisticated label hierarchy and the masked language model (MLM) pretraining task of PLMs, so the potential of PLMs cannot be fully tapped. To bridge this gap, we propose HPT, a Hierarchy-aware Prompt Tuning method that handles HTC from a multi-label MLM perspective. Specifically, we construct a dynamic virtual template and label words that take the form of soft prompts to fuse label hierarchy knowledge, and we introduce a zero-bounded multi-label cross-entropy loss to harmonize the objectives of HTC and MLM. Extensive experiments show that HPT achieves state-of-the-art performance on three popular HTC datasets and is adept at handling imbalanced and low-resource situations. Our code is available at https://github.com/wzh9969/HPT.
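
The zero-bounded multi-label cross-entropy loss mentioned in the abstract is the piece that ties multi-label HTC to an MLM-style objective. Below is a minimal PyTorch sketch, assuming the common formulation in which positive-label scores are pushed above a shared threshold of zero and negative-label scores below it; the function name, argument names, and tensor layout are illustrative and not taken from the authors' repository.

    import torch

    def zero_bounded_multilabel_ce(logits, targets):
        # Illustrative sketch (not the authors' exact code).
        # logits:  (batch, num_labels) raw label scores
        # targets: (batch, num_labels) binary {0, 1} label indicators
        # Scores of negative labels (positives masked out with -inf).
        neg_scores = logits.masked_fill(targets.bool(), float("-inf"))
        # Negated scores of positive labels (negatives masked out with -inf).
        pos_scores = (-logits).masked_fill(~targets.bool(), float("-inf"))
        # The zero bound: a fixed score of 0 (i.e. e^0 = 1) acts as the
        # shared decision threshold for both groups.
        zeros = torch.zeros_like(logits[..., :1])
        neg_loss = torch.logsumexp(torch.cat([zeros, neg_scores], dim=-1), dim=-1)
        pos_loss = torch.logsumexp(torch.cat([zeros, pos_scores], dim=-1), dim=-1)
        return (neg_loss + pos_loss).mean()

At inference time, under this formulation a label is predicted whenever its score exceeds zero, which avoids tuning per-label thresholds.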

Authors (7)
  1. Zihan Wang (181 papers)
  2. Peiyi Wang (48 papers)
  3. Tianyu Liu (177 papers)
  4. Binghuai Lin (20 papers)
  5. Yunbo Cao (43 papers)
  6. Zhifang Sui (89 papers)
  7. Houfeng Wang (43 papers)
Citations (43)
