
BERT Fine-tuning For Arabic Text Summarization (2004.14135v1)

Published 29 Apr 2020 in cs.CL and cs.LG

Abstract: Fine-tuning a pretrained BERT model is the state-of-the-art method for extractive and abstractive text summarization. In this paper we show how this fine-tuning method can be applied to Arabic, both to construct the first documented model for abstractive Arabic text summarization and to demonstrate its performance on Arabic extractive summarization. Our model builds on multilingual BERT, since Arabic does not have a pretrained BERT of its own. We evaluate its performance on an English corpus first, before applying it to Arabic corpora in both extractive and abstractive tasks.
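To make the fine-tuning setup concrete, below is a minimal sketch of BertSum-style extractive summarization on top of multilingual BERT, using the Hugging Face transformers library: each sentence is prefixed with a [CLS] token, and a small linear head scores each sentence's [CLS] vector for inclusion in the summary. This is an illustrative reconstruction, not the authors' released code; the class name, scoring head, and checkpoint choice (bert-base-multilingual-cased) are assumptions.

```python
# Minimal sketch (assumed, not the authors' code) of BertSum-style
# extractive summarization with multilingual BERT.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

MODEL_NAME = "bert-base-multilingual-cased"  # mBERT covers Arabic

class ExtractiveSummarizer(nn.Module):
    def __init__(self, model_name: str = MODEL_NAME):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        # One inclusion score per sentence-level [CLS] vector.
        self.scorer = nn.Linear(self.bert.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask, cls_positions):
        hidden = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # Gather the [CLS] vector sitting in front of each sentence.
        cls_vecs = hidden[0, cls_positions]  # (num_sentences, hidden_size)
        return torch.sigmoid(self.scorer(cls_vecs)).squeeze(-1)

tokenizer = BertTokenizerFast.from_pretrained(MODEL_NAME)
sentences = ["First sentence of the document.", "Second sentence."]
# BertSum input format: "[CLS] sent [SEP] [CLS] sent [SEP] ..."
text = " ".join(f"{tokenizer.cls_token} {s} {tokenizer.sep_token}"
                for s in sentences)
enc = tokenizer(text, add_special_tokens=False, return_tensors="pt")
cls_positions = (enc.input_ids[0] == tokenizer.cls_token_id
                 ).nonzero(as_tuple=True)[0]

model = ExtractiveSummarizer()
scores = model(enc.input_ids, enc.attention_mask, cls_positions)
# The extractive summary is the top-k highest-scoring sentences.
top_idx = scores.topk(k=1).indices.tolist()
print([sentences[i] for i in top_idx])
```

Fine-tuning would train the scorer (and optionally the encoder) with a binary cross-entropy loss against oracle sentence labels; the abstractive variant instead pairs the BERT encoder with a generative decoder.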

Authors (3)
  1. Khalid N. Elmadani (5 papers)
  2. Mukhtar Elgezouli (1 paper)
  3. Anas Showk (1 paper)
Citations (22)