Weighted Automata and Recurrence Equations for Regular Languages (1007.1045v2)

Published 7 Jul 2010 in cs.FL, cs.DM, and math.CO

Abstract: Let $\mathcal{P}(\Sigma^*)$ be the semiring of languages, and consider its subset $\mathcal{P}(\Sigma)$. In this paper we define the language recognized by a weighted automaton over $\mathcal{P}(\Sigma)$ and a one-letter alphabet. Similarly, we introduce the notion of language recognition by linear recurrence equations with coefficients in $\mathcal{P}(\Sigma)$. As we will see, these two definitions coincide. We prove that the languages recognized by linear recurrence equations with coefficients in $\mathcal{P}(\Sigma)$ are precisely the regular languages, thus providing an alternative way to present these languages. A remarkable consequence of this kind of recognition is that it induces a partition of the language into its cross-sections, where the $n$th cross-section contains all the words of length $n$ in the language. Finally, we show how to use linear recurrence equations to calculate the density function of a regular language, which assigns to every $n$ the number of words of length $n$ in the language. We also show how to count the number of successful paths of a weighted automaton.
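The ideas in the abstract can be illustrated with a small sketch. The Python code below is not taken from the paper; it is a minimal illustration under the assumption that a set-valued linear recurrence can be read off a DFA: iterating the recurrence with coefficients in $\mathcal{P}(\Sigma)$ produces the cross-sections of the language, and interpreting the same recurrence over the natural numbers produces the density function. The example language (words over $\{a, b\}$ with no factor $bb$), the state names, and the helper functions `cross_sections` and `density` are illustrative choices, not notation from the paper.

```python
# A small DFA for the regular language over {a, b} with no factor "bb".
# States: 0 = start / last letter not 'b' (accepting),
#         1 = last letter was 'b' (accepting),
#         2 = dead state (saw "bb").
SIGMA = "ab"
STATES = {0, 1, 2}
START = 0
ACCEPTING = {0, 1}
DELTA = {
    (0, "a"): 0, (0, "b"): 1,
    (1, "a"): 0, (1, "b"): 2,
    (2, "a"): 2, (2, "b"): 2,
}

def cross_sections(n_max):
    """Iterate the set-valued recurrence
        X_q(n) = union over c in Sigma of {c} . X_{delta(q, c)}(n - 1),
        X_q(0) = {""} if q is accepting, else {},
    and return the cross-sections X_START(0), ..., X_START(n_max)."""
    X = {q: ({""} if q in ACCEPTING else set()) for q in STATES}
    sections = [set(X[START])]
    for _ in range(n_max):
        X = {q: {c + w for c in SIGMA for w in X[DELTA[(q, c)]]} for q in STATES}
        sections.append(set(X[START]))
    return sections

def density(n_max):
    """Same recurrence over the naturals: d_q(n) = sum over c of d_{delta(q, c)}(n - 1)."""
    d = {q: (1 if q in ACCEPTING else 0) for q in STATES}
    counts = [d[START]]
    for _ in range(n_max):
        d = {q: sum(d[DELTA[(q, c)]] for c in SIGMA) for q in STATES}
        counts.append(d[START])
    return counts

if __name__ == "__main__":
    for n, (sec, cnt) in enumerate(zip(cross_sections(4), density(4))):
        assert len(sec) == cnt  # the density counts the words in each cross-section
        print(n, cnt, sorted(sec))
```

For this example the counts come out as 1, 2, 3, 5, 8, ...: the familiar Fibonacci-style recurrence for binary words avoiding $bb$, obtained here simply by replacing each set-valued coefficient with its cardinality.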

Citations (1)
