
Data Incubation -- Synthesizing Missing Data for Handwriting Recognition (2110.07040v1)

Published 13 Oct 2021 in cs.CV and cs.LG

Abstract: In this paper, we demonstrate how a generative model can be used to build a better recognizer through the control of content and style. We are building an online handwriting recognizer from a modest number of training samples. By training our controllable handwriting synthesizer on the same data, we can synthesize handwriting with previously underrepresented content (e.g., URLs and email addresses) and style (e.g., cursive and slanted). Moreover, we propose a framework to analyze a recognizer that is trained with a mixture of real and synthetic training data. We use the framework to optimize data synthesis and demonstrate significant improvement on handwriting recognition over a model trained on real data only. Overall, we achieve a 66% reduction in Character Error Rate.
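The core idea in the abstract is to augment a small real dataset with samples from a controllable synthesizer, targeting underrepresented content (URLs, email addresses) and styles (cursive, slanted), and then train the recognizer on the mixture. Below is a minimal sketch of that data-mixing step, assuming a PyTorch-style data pipeline; the class names, the synthesizer interface, and the prompt/style lists are hypothetical illustrations, not the authors' code.

```python
# Hedged sketch: mixing real and synthesized handwriting samples for
# recognizer training. All names here are placeholders, not the paper's API.
import random
from torch.utils.data import Dataset, ConcatDataset

class RealHandwriting(Dataset):
    """Placeholder for the modest real online-handwriting dataset."""
    def __init__(self, samples):
        self.samples = samples  # list of (ink_strokes, label) pairs

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        return self.samples[idx]

class SyntheticHandwriting(Dataset):
    """Placeholder for samples drawn from a controllable synthesizer,
    biased toward underrepresented content and styles."""
    def __init__(self, synthesizer, prompts, styles, n_samples):
        self.synthesizer = synthesizer  # hypothetical callable: (text, style) -> ink
        self.items = [(random.choice(prompts), random.choice(styles))
                      for _ in range(n_samples)]

    def __len__(self):
        return len(self.items)

    def __getitem__(self, idx):
        text, style = self.items[idx]
        ink = self.synthesizer(text, style)
        return ink, text

def build_training_set(real_ds, synth_ds):
    # Simple concatenation of real and synthetic data; the paper's framework
    # additionally analyzes and optimizes how much and what kind of
    # synthetic data to add.
    return ConcatDataset([real_ds, synth_ds])
```

Under this sketch, the recognizer is trained as usual on the concatenated dataset; the paper's contribution is the framework for choosing the synthetic content and style distribution so that the mixture yields the reported error-rate reduction.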

Authors (7)
  1. Jen-Hao Rick Chang (18 papers)
  2. Martin Bresler (1 paper)
  3. Youssouf Chherawala (3 papers)
  4. Adrien Delaye (1 paper)
  5. Thomas Deselaers (7 papers)
  6. Ryan Dixon (1 paper)
  7. Oncel Tuzel (62 papers)
Citations (1)
