
A Sentence-level Hierarchical BERT Model for Document Classification with Limited Labelled Data (2106.06738v1)

Published 12 Jun 2021 in cs.CL

Abstract: Training deep learning models with limited labelled data is an attractive scenario for many NLP tasks, including document classification. While, with the recent emergence of BERT, deep learning language models can achieve reasonably good performance in document classification with few labelled instances, there is a lack of evidence on the utility of applying BERT-like models to long document classification. This work introduces a long-text-specific model -- the Hierarchical BERT Model (HBM) -- that learns sentence-level features of the text and works well in scenarios with limited labelled data. Evaluation experiments demonstrate that HBM achieves higher performance in document classification than previous state-of-the-art methods with only 50 to 200 labelled instances, especially when documents are long. As an extra benefit, a user study shows that the salient sentences identified by a learned HBM are useful as explanations for labelling documents.
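To make the architecture described in the abstract concrete, below is a minimal sketch of a sentence-level hierarchical BERT classifier in PyTorch: each sentence is encoded with BERT, a small Transformer layer runs over the resulting sentence embeddings, and an attention-pooled document vector feeds a classifier. The attention weights also expose per-sentence salience, echoing the abstract's point about salient sentences as explanations. This is an illustrative assumption-laden sketch (model name, layer sizes, pooling choice, and all identifiers are ours), not the authors' exact HBM implementation.

```python
# Sketch of a sentence-level hierarchical BERT classifier (HBM-like).
# Assumes Hugging Face `transformers` and PyTorch; all hyperparameters
# and names here are illustrative, not the paper's exact configuration.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class HierarchicalBertClassifier(nn.Module):
    def __init__(self, model_name="bert-base-uncased", num_labels=2):
        super().__init__()
        # Sentence-level encoder: BERT applied to each sentence independently.
        self.sentence_encoder = AutoModel.from_pretrained(model_name)
        hidden = self.sentence_encoder.config.hidden_size
        # Document-level encoder: one Transformer layer over sentence embeddings.
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=8, batch_first=True)
        self.sentence_transformer = nn.TransformerEncoder(layer, num_layers=1)
        self.attention = nn.Linear(hidden, 1)        # scores sentence salience
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        # input_ids: (num_sentences, seq_len) -- one document, one row per sentence.
        out = self.sentence_encoder(input_ids=input_ids, attention_mask=attention_mask)
        sent_emb = out.last_hidden_state[:, 0]        # [CLS] embedding per sentence
        sent_emb = self.sentence_transformer(sent_emb.unsqueeze(0))   # (1, S, H)
        weights = torch.softmax(self.attention(sent_emb), dim=1)      # (1, S, 1)
        doc_emb = (weights * sent_emb).sum(dim=1)     # attention-pooled document vector
        # Return logits plus salience weights, usable as sentence-level explanations.
        return self.classifier(doc_emb), weights.squeeze(-1)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
sentences = ["The first sentence of the document.", "A second, possibly salient one."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
model = HierarchicalBertClassifier()
logits, salience = model(batch["input_ids"], batch["attention_mask"])
```

Treating each document as a short sequence of sentence embeddings keeps every BERT forward pass within its usual length limit, which is one plausible reading of how a sentence-level hierarchy sidesteps the long-document problem the abstract raises.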

Authors (4)
  1. Jinghui Lu (28 papers)
  2. Maeve Henchion (2 papers)
  3. Ivan Bacher (2 papers)
  4. Brian Mac Namee (36 papers)
Citations (22)
