Assessing GPT Model Uncertainty in Mathematical OCR Tasks via Entropy Analysis (2412.01221v2)

Published 2 Dec 2024 in cs.IT and math.IT

Abstract: This paper investigates the uncertainty of Generative Pre-trained Transformer (GPT) models in extracting mathematical equations from images of varying resolutions and converting them into LaTeX code. We employ concepts of entropy and mutual information to examine the recognition process and assess the model's uncertainty in this Optical Character Recognition (OCR) task. By analyzing the conditional entropy of the output token sequences, we provide both theoretical insights and practical measurements of the GPT model's performance given different image qualities. Our experimental results, obtained using a Python implementation available on GitHub, demonstrate a clear relationship between image resolution and GPT model uncertainty. Higher-resolution images lead to lower entropy values, indicating reduced uncertainty and improved accuracy in the recognized LaTeX code. Conversely, lower-resolution images result in increased entropy, reflecting higher uncertainty and a higher likelihood of recognition errors. These findings highlight the practical importance of considering image quality in GPT-based mathematical OCR applications and demonstrate how entropy analysis, grounded in information-theoretic concepts, can effectively quantify model uncertainty in real-world tasks.
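
The central quantity in the abstract is the conditional entropy of the model's output token sequence. Below is a minimal sketch of how a per-token entropy estimate of this kind can be computed; it is not the authors' released implementation, and the assumption that per-token top-k log-probabilities are available (e.g., from an API's logprobs option) is mine.

```python
import math

def token_entropy(top_logprobs):
    """Shannon entropy (bits) of one output token's top-k distribution.

    top_logprobs: natural-log probabilities of the k most likely tokens
    at this position. Probability mass outside the top-k is discarded
    and the rest renormalized, so this approximates (and typically
    underestimates) the true entropy.
    """
    probs = [math.exp(lp) for lp in top_logprobs]
    total = sum(probs)  # renormalize over the observed top-k
    return -sum((p / total) * math.log2(p / total) for p in probs if p > 0)

def sequence_entropy(per_token_top_logprobs):
    """Mean per-token entropy over a recognized LaTeX token sequence."""
    entropies = [token_entropy(t) for t in per_token_top_logprobs]
    return sum(entropies) / len(entropies)

# Example: a confident token versus an uncertain one.
confident = [math.log(0.97), math.log(0.02), math.log(0.01)]
uncertain = [math.log(0.40), math.log(0.35), math.log(0.25)]
print(token_entropy(confident))  # ~0.22 bits
print(token_entropy(uncertain))  # ~1.56 bits
```

Under this reading, the paper's finding corresponds to low-resolution inputs producing flatter per-token distributions (higher entropy, as in the second example) and high-resolution inputs producing peaked ones (lower entropy).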

