
The Challenges of HTR Model Training: Feedback from the Project Donner le goût de l'archive à l'ère numérique (2212.11146v4)

Published 13 Dec 2022 in cs.CV and cs.LG

Abstract: The arrival of handwriting recognition technologies offers new possibilities for research in heritage studies. However, it is now necessary to reflect on the experiences and the practices developed by research teams. Our use of the Transkribus platform since 2018 has led us to search for the most significant ways to improve the performance of our handwritten text recognition (HTR) models, which are designed to transcribe French handwriting dating from the 17th century. This article therefore reports on the impacts of creating transcription protocols, using the language model at full scale, and determining the best way to use base models in order to increase the performance of HTR models. Combining all of these elements can indeed increase the performance of a single model by more than 20% (reaching a Character Error Rate below 5%). This article also discusses some challenges regarding the collaborative nature of HTR platforms such as Transkribus and the way researchers can share the data generated in the process of creating or training handwritten text recognition models.
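The Character Error Rate (CER) cited in the abstract is conventionally defined as the character-level Levenshtein (edit) distance between the model's output and the ground-truth transcription, divided by the length of the ground truth. A minimal sketch of that standard metric, with invented example strings (not taken from the paper's corpus):

```python
def character_error_rate(reference: str, hypothesis: str) -> float:
    """CER = character-level Levenshtein distance / reference length."""
    m, n = len(reference), len(hypothesis)
    # prev[j] holds the edit distance between reference[:i-1] and hypothesis[:j]
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            curr[j] = min(prev[j] + 1,          # deletion
                          curr[j - 1] + 1,      # insertion
                          prev[j - 1] + cost)   # substitution
        prev = curr
    return prev[n] / m if m else 0.0

# One deletion ("goust" -> "gout") and one insertion (the apostrophe)
# over a 20-character reference gives a CER of 2/20 = 0.1.
print(character_error_rate("le goust de larchive", "le gout de l'archive"))
```

A CER below 5% therefore means fewer than 5 misread characters per 100 characters of ground truth, the threshold the project reports reaching.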

Authors (4)
  1. Béatrice Couture (1 paper)
  2. Farah Verret (1 paper)
  3. Maxime Gohier (1 paper)
  4. Dominique Deslandres (1 paper)
Citations (1)
