
Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language (2208.01875v1)

Published 3 Aug 2022 in cs.CL

Abstract: We present a new pre-trained language model (PLM) for Rabbinic Hebrew, termed Berel (BERT Embeddings for Rabbinic-Encoded Language). Whilst other PLMs exist for processing Hebrew texts (e.g., HeBERT, AlephBert), they are all trained on modern Hebrew texts, which diverge substantially from Rabbinic Hebrew in terms of their lexicographical, morphological, syntactic and orthographic norms. We demonstrate the superiority of Berel on Rabbinic texts via a challenge set of Hebrew homographs. We release the new model and homograph challenge set for unrestricted use.
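
Because Berel is a BERT-style PLM, the natural way to exercise it is masked-token prediction, which is also the mechanism a homograph challenge set probes: the model must rank completions using Rabbinic, rather than modern, Hebrew context. The sketch below uses the Hugging Face `transformers` library; the model ID `dicta-il/BEREL` and the example sentence are illustrative assumptions, not details taken from the abstract.

```python
# Minimal masked-token prediction sketch for a BERT-style Hebrew PLM.
# Assumption: the released Berel checkpoint is hosted on the Hugging Face
# hub under "dicta-il/BEREL"; substitute the actual path if it differs.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="dicta-il/BEREL")

# Illustrative input (not from the paper): mask one token of a Hebrew
# sentence and inspect the model's top-ranked completions with scores.
for pred in fill_mask("בראשית ברא אלהים את [MASK] ואת הארץ"):
    print(pred["token_str"], round(pred["score"], 3))
```

The same fill-mask scoring can be applied to each reading of a homograph in context, which is one way to compare Berel against modern-Hebrew PLMs such as HeBERT or AlephBert on Rabbinic text.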

Authors (6)
  1. Avi Shmidman (13 papers)
  2. Joshua Guedalia (3 papers)
  3. Shaltiel Shmidman (10 papers)
  4. Cheyn Shmuel Shmidman (3 papers)
  5. Eli Handel (1 paper)
  6. Moshe Koppel (16 papers)
Citations (4)