
DrBERT: A Robust Pre-trained Model in French for Biomedical and Clinical domains (2304.00958v2)

Published 3 Apr 2023 in cs.CL

Abstract: In recent years, pre-trained language models (PLMs) have achieved the best performance on a wide range of NLP tasks. While the first models were trained on general-domain data, specialized ones have emerged to handle specific domains more effectively. In this paper, we propose an original study of PLMs in the medical domain for the French language. We compare, for the first time, the performance of PLMs trained on both public data from the web and private data from healthcare establishments. We also evaluate different learning strategies on a set of biomedical tasks. In particular, we show that we can take advantage of already existing biomedical PLMs in a foreign language by further pre-training them on our targeted data. Finally, we release the first specialized PLMs for the biomedical field in French, called DrBERT, as well as the largest corpus of medical data under free license on which these models are trained.
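The continued pre-training strategy mentioned in the abstract (adapting an existing PLM to the target domain) is, in standard practice, masked-language-model training resumed on the new corpus. Below is a minimal sketch using Hugging Face transformers; it is not the authors' exact setup. The starting checkpoint camembert-base, the corpus file french_medical_corpus.txt, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch: continued (domain-adaptive) MLM pre-training with
# Hugging Face transformers. Checkpoint, file names, and hyperparameters
# are illustrative assumptions, not the paper's configuration.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Any existing PLM could serve as the starting point; camembert-base
# stands in here for whichever biomedical or general-domain model is adapted.
checkpoint = "camembert-base"  # assumption: swap in the PLM to adapt
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Hypothetical plain-text corpus: one French medical document per line.
dataset = load_dataset("text", data_files={"train": "french_medical_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard BERT-style dynamic masking at a 15% rate.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="plm-continued-pretraining",
    per_device_train_batch_size=16,
    num_train_epochs=1,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    data_collator=collator,
    train_dataset=tokenized,
).train()
```

The alternative strategy the paper evaluates, pre-training from scratch on the specialized corpus, differs only in starting from a randomly initialized model (and typically a tokenizer trained on the target corpus) rather than from an existing checkpoint.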

Authors (7)
  1. Yanis Labrak (12 papers)
  2. Adrien Bazoge (6 papers)
  3. Richard Dufour (33 papers)
  4. Mickael Rouvier (25 papers)
  5. Emmanuel Morin (13 papers)
  6. Pierre-Antoine Gourraud (5 papers)
  7. Béatrice Daille (10 papers)
Citations (51)
