GREEK-BERT: The Greeks visiting Sesame Street (2008.12014v2)

Published 27 Aug 2020 in cs.CL

Abstract: Transformer-based language models, such as BERT and its variants, have achieved state-of-the-art performance in several downstream NLP tasks on generic benchmark datasets (e.g., GLUE, SQuAD, RACE). However, these models have mostly been applied to the resource-rich English language. In this paper, we present GREEK-BERT, a monolingual BERT-based language model for modern Greek. We evaluate its performance on three NLP tasks, i.e., part-of-speech tagging, named entity recognition, and natural language inference, obtaining state-of-the-art results. Interestingly, in two of the benchmarks GREEK-BERT outperforms two multilingual Transformer-based models (M-BERT, XLM-R), as well as shallower neural baselines operating on pre-trained word embeddings, by a large margin (5%-10%). Most importantly, we make both GREEK-BERT and our training code publicly available, along with code illustrating how GREEK-BERT can be fine-tuned for downstream NLP tasks. We expect these resources to boost NLP research and applications for modern Greek.
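Since the authors released the model publicly, it can be loaded with the Hugging Face Transformers library. The sketch below assumes the checkpoint id "nlpaueb/bert-base-greek-uncased-v1" (the name under which the AUEB NLP group published GREEK-BERT; verify against the official repository) and probes the pre-trained masked-language-modeling head:

```python
# Minimal sketch: load GREEK-BERT for masked-token prediction.
# Assumption: the public checkpoint id is "nlpaueb/bert-base-greek-uncased-v1".
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="nlpaueb/bert-base-greek-uncased-v1",
)

# "Today is a beautiful [MASK]." in Greek; print the top predicted fillers.
for prediction in fill_mask("Σήμερα είναι μια όμορφη [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```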

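The abstract also notes that the released code shows how GREEK-BERT can be fine-tuned for downstream tasks. The following is a hedged sketch, not the authors' released fine-tuning code, of the standard Transformers setup for one of the paper's three tasks, natural language inference, where premise and hypothesis are encoded as a sentence pair; the checkpoint id and the label-index mapping are assumptions for illustration:

```python
# Hedged sketch: fine-tuning GREEK-BERT for 3-way NLI with Transformers.
# Assumptions: checkpoint id "nlpaueb/bert-base-greek-uncased-v1";
# hypothetical label mapping 0=entailment, 1=neutral, 2=contradiction.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "nlpaueb/bert-base-greek-uncased-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id,
    num_labels=3,  # entailment / neutral / contradiction
)

# Encode a premise-hypothesis pair as one [CLS] ... [SEP] ... [SEP] input.
batch = tokenizer(
    "Ο σκύλος τρέχει στο πάρκο.",  # premise: "The dog runs in the park."
    "Ένα ζώο βρίσκεται έξω.",      # hypothesis: "An animal is outside."
    return_tensors="pt",
)
labels = torch.tensor([0])  # hypothetical index for "entailment"

# One gradient step; in practice this sits inside a training loop
# over an NLI dataset such as the Greek portion of XNLI.
outputs = model(**batch, labels=labels)
outputs.loss.backward()
```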
Authors (4)
  1. John Koutsikakis
  2. Ilias Chalkidis
  3. Prodromos Malakasiotis
  4. Ion Androutsopoulos
Citations (83)