Neural Lattice Decoders
Published 2 Jul 2018 in cs.IT and math.IT | (1807.00592v3)
Abstract: Lattice decoders constructed with neural networks are presented. First, we show how the fundamental parallelotope is used as a compact set for the approximation by a neural lattice decoder. Second, we introduce the notion of a Voronoi-reduced lattice basis. As a consequence, a first optimal neural lattice decoder is built from Boolean equations and the facets of the Voronoi cell. This decoder needs no learning. Finally, we present two neural decoders with learning. It is shown that L1 regularization and *a priori* information about the lattice structure lead to a simplification of the model.
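The fundamental parallelotope mentioned in the abstract is the set P(B) = {Bz : z ∈ [0,1)^n} for a lattice basis B; folding any received point into this compact set is the standard preprocessing step before decoding. The sketch below illustrates that folding for a hypothetical 2-D basis (the hexagonal A2 lattice is an assumption chosen for illustration, not taken from the paper):

```python
import numpy as np

# Hypothetical basis for illustration: columns generate the hexagonal A2 lattice.
B = np.array([[1.0, 0.5],
              [0.0, np.sqrt(3.0) / 2.0]])

def fold_to_parallelotope(y, B):
    """Reduce y modulo the lattice so the result lies in the
    fundamental parallelotope P(B) = {B z : z in [0,1)^n}."""
    z = np.linalg.solve(B, y)   # coordinates of y in the basis B
    frac = z - np.floor(z)      # fractional part, componentwise in [0,1)
    return B @ frac

y = np.array([3.7, -2.2])
t = fold_to_parallelotope(y, B)
# t and y differ by a lattice point, and t lies inside P(B).
```

Because t = y mod Λ, a decoder trained (or constructed) only on P(B) suffices for all inputs, which is what makes the parallelotope a convenient compact approximation domain.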