
Can Model Fusing Help Transformers in Long Document Classification? An Empirical Study (2307.09532v1)

Published 18 Jul 2023 in cs.CL

Abstract: Text classification has been studied in NLP for many years. Adapting NLP to multiple domains has introduced new challenges for text classification, one of which is long document classification. While state-of-the-art transformer models provide excellent results in text classification, most of them limit the maximum length of the input sequence. The majority of transformer models accept at most 512 tokens and therefore struggle with long document classification problems. In this research, we explore employing Model Fusing for long document classification, comparing the results with the well-known BERT and Longformer architectures.
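The abstract describes fusing multiple transformer models to handle documents longer than a single model's input window. A common late-fusion scheme, sketched below under assumed details (the paper's exact fusing method is not given here), averages per-class probabilities from each model's logits, e.g. a BERT run on the first 512 tokens and a Longformer run on the full document; the logit values are hypothetical:

```python
import math

def softmax(logits):
    """Convert raw class logits to probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_predictions(model_logits, weights=None):
    """Late fusion: weighted average of per-class probabilities across models."""
    n = len(model_logits)
    weights = weights or [1.0 / n] * n
    probs = [softmax(logits) for logits in model_logits]
    num_classes = len(probs[0])
    return [sum(w * p[c] for w, p in zip(weights, probs))
            for c in range(num_classes)]

# Hypothetical logits for one long document over three classes:
bert_logits = [2.0, 0.5, -1.0]        # e.g. BERT on the first 512 tokens
longformer_logits = [1.5, 1.0, -0.5]  # e.g. Longformer on the full document

fused = fuse_predictions([bert_logits, longformer_logits])
predicted_class = fused.index(max(fused))
```

Weighted averaging lets a stronger model dominate the final decision; an equal-weight average is the simplest baseline.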

Authors (3)
  1. Damith Premasiri (10 papers)
  2. Tharindu Ranasinghe (52 papers)
  3. Ruslan Mitkov (15 papers)
Citations (1)
