Semantic Embeddings in Semilattices (2205.12618v1)

Published 25 May 2022 in cs.DM, cs.LO, math.AC, math.LO, and math.RA

Abstract: To represent anything from mathematical concepts to real-world objects, we have to resort to an encoding. Encodings, such as written language, usually assume a decoder that understands a rich shared code. A semantic embedding is a form of encoding that assumes a decoder with no knowledge, or little knowledge, beyond the basic rules of a mathematical formalism such as an algebra. Here we give a formal definition of a semantic embedding in a semilattice, which can be used to solve machine learning and classic computer science problems. Specifically, a semantic embedding of a problem is here an encoding of the problem as sentences in an algebraic theory that extends the theory of semilattices. We use the recently introduced formalism of finite atomized semilattices to study the properties of the embeddings and their finite models. For a problem embedded in a semilattice, we show that every solution has a model atomized by an irreducible subset of the non-redundant atoms of the freest model of the embedding. We give examples of semantic embeddings that can be used to find solutions for the N-Queens Completion, Sudoku, and Hamiltonian Path problems.
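As a rough illustration of the setting (a sketch based on standard semilattice algebra, not on the paper's own constructions), the background theory is that of a commutative, associative, idempotent binary operation $\odot$, with the order $a \le b$ defined by absorption:

$a \odot a = a, \qquad a \odot b = b \odot a, \qquad (a \odot b) \odot c = a \odot (b \odot c), \qquad a \le b \iff a \odot b = b.$

A semantic embedding in this sense expresses a problem's constraints as positive and negative order statements over named constants. For a hypothetical covering constraint $c$ with candidate choices $x_1, x_2, x_3$, an embedding could include the sentences $c \le x_1 \odot x_2$ (selecting $x_1$ and $x_2$ together satisfies $c$) and $c \not\le x_3$ ($x_3$ alone does not); the constants $c$ and $x_i$ are illustrative names, not taken from the paper.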
