
Beyond Bits: An Introduction to Computation over the Reals

Published 31 Mar 2026 in cs.CC, cs.CG, and cs.DS | (2603.29427v1)

Abstract: We introduce a lightweight and accessible approach to computation over the real numbers, with the aim of clarifying both the underlying concepts and their relevance in modern research. The material is intended for a broad audience, including instructors who wish to incorporate real computation into algorithms courses, their students, and PhD students encountering the subject for the first time. Rather than striving for completeness, we focus on a carefully selected set of results that can be presented and proved in a classroom setting. This allows us to highlight core techniques and recurring ideas while maintaining an approachable exposition. In some places, the presentation is intentionally informal, prioritizing intuition and practical understanding over full technical precision. We position our exposition relative to existing literature, including Matousek's lecture notes on ER-completeness and the recent compendium of ER-complete problems by Schaefer, Cardinal, and Miltzow. While these works provide deep and comprehensive perspectives, our goal is to offer an accessible entry point with proofs and examples suitable for teaching. Our approach follows modern formulations of real computation that emphasize binary input, real-valued witnesses, and restricted use of constants, aligning more closely with contemporary complexity theory, while acknowledging the foundational contributions of the Blum--Shub--Smale model.

Authors (1)

Summary

  • The paper introduces real-number computational models that treat continuous data as primary objects to bridge algebraic circuit evaluation with geometric problem hardness.
  • It rigorously compares classical discrete models with extended real RAM and Turing machine models, highlighting constant-time arithmetic and precise operational semantics.
  • It demonstrates the ∃ℝ-completeness of key geometric problems, using algebraic gadget reductions to show practical implications in areas like the art gallery problem.

An Expert Summary of "Beyond Bits: An Introduction to Computation over the Reals" (2603.29427)

Introduction and Motivation

The manuscript provides a methodical exposition of models of computation over the real numbers, their conceptual motivations, and the implications for computational complexity, particularly focusing on the ∃ℝ complexity class. This approach deliberately diverges from the traditional bit-based (discrete) computational model to consider models where real numbers are first-class computational objects and the nature of efficient computation can be explored in a continuous domain. This is crucial for fields such as computational geometry and algebraic complexity, where many natural problems inherently resist bit-level encodings.

Computational Models: Word RAM, Turing Machine, and Extensions

The paper rigorously formulates the classical word RAM and Turing machine as reference models for discrete computation. The discussion focuses on the precision and unambiguity afforded by the word RAM, balancing formal operational semantics and practical hardware correspondence. The word RAM enables constant-time arithmetic and addressing on finite words, typically of size Θ(log n), providing a platform for precise operation counting and algorithmic complexity analysis that abstracts hardware peculiarities.
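The word-size discipline can be sketched concretely. The following is a minimal illustration (mine, not the paper's), simulating w-bit word RAM arithmetic by truncating every result to one machine word; the constant `W` plays the role of the Θ(log n) word size.

```python
# Minimal sketch (not from the paper): simulating w-bit word RAM
# arithmetic in Python. Each operation truncates to the word size,
# mirroring constant-time hardware arithmetic on Theta(log n) bits.

W = 16                 # word size; stands in for Theta(log n)
MASK = (1 << W) - 1    # keeps every result inside one machine word

def add(a, b):
    """Constant-time word addition: wraps around modulo 2^W."""
    return (a + b) & MASK

def mul(a, b):
    """Constant-time word multiplication, truncated to W bits."""
    return (a * b) & MASK

# Overflow wraps rather than growing: a word RAM step never
# manipulates a number longer than one word.
print(add(MASK, 1))     # 0
print(mul(3, 5))        # 15
```

The point of the truncation is that cost accounting stays honest: no single step can smuggle in more than w bits of work.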

The Turing machine serves as a mathematically minimalist and maximally robust model, reinforcing the foundational equivalence (up to polynomial factors) of diverse discrete computational frameworks. The translation between descriptions in high-level pseudocode and the formal operational semantics on these models is emphasized as key for both rigorous analysis and pedagogical clarity.

The extension to computation with real numbers gives rise to the real RAM and real Turing machine models. Here, registers may hold arbitrary real values, and finite sequences of arithmetic operations (+, −, ×, ÷, comparisons) are permitted as constant-time steps. Crucially, the model excludes unrestricted access to the bitwise representation or transcendental functions of the reals, to avoid pathological collapses of complexity (as insightfully demonstrated by the polynomial-time "factoring" algorithm possible with unbounded rounding access, originally due to Shamir).
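Why rounding is dangerous can be seen in a few lines. The sketch below (my illustration, not the paper's or Shamir's actual construction) shows two effects: with floor available at unit cost, a real RAM can decode a real input bit by bit, erasing the distinction between real values and bit strings; and repeated squaring produces integers of doubly-exponential magnitude in linearly many unit-cost steps.

```python
from fractions import Fraction
import math

# Illustration (not the paper's proof): unit-cost arithmetic *plus*
# floor lets a real RAM read off the binary expansion of a real.

def extract_bits(x, k):
    """Return the first k binary digits of x in [0, 1), using only
    +, *, and floor -- one digit per constant-time step."""
    bits = []
    for _ in range(k):
        x = x * 2
        b = math.floor(x)   # the rounding step the model forbids
        bits.append(b)
        x = x - b
    return bits

print(extract_bits(Fraction(5, 8), 3))   # [1, 0, 1], since 5/8 = 0.101 in binary

# Repeated squaring: n unit-cost multiplications yield an integer
# with about 2^n bits, so "unit cost" hides exponential bit-level work.
v = 2
for _ in range(5):
    v = v * v
print(v.bit_length())    # 33, since 2^(2^5) has 2^5 + 1 bits
```

Combining the two effects (huge numbers plus the ability to inspect their digits) is what makes rounding-augmented models collapse, which is why the real RAM admits only the four arithmetic operations and comparisons.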

The Existential Theory of the Reals (∃ℝ) and the PosSLP Problem

The foundational complexity class ∃ℝ formalizes the set of decision problems reducible in polynomial time to the existential theory of the reals (ETR): the feasibility of polynomial constraint systems over ℝ. The paper details two equivalent definitions:

  1. Logical Formulation: Decision problems encoded as satisfiability of existential sentences over real variables, with polynomials as atomic predicates.
  2. Machine Model: Problems where membership admits a real RAM polynomial-time verifier with real-valued witnesses, reflecting the logical structure via explicit operational mechanics.

This complexity class captures a robust generalization of NP, extending classical certification from discrete/binary witnesses to real-valued certificates and accommodating problems whose solutions (witnesses) may inherently involve irrational coordinates.

A crucial result is the tight connection to the PosSLP problem: determining the sign of the integer computed by a straight-line program using +, −, ×. The decision problems on binary inputs solvable in polynomial time on the constant-free real RAM are precisely those in P^PosSLP, with the PosSLP oracle bridging the computational gap between real and bit models for polynomial-time decidability. This equivalence has deep implications for the complexity-theoretic landscape, underscoring the central role of algebraic circuit evaluation in the hierarchy between P, NP, and ∃ℝ [allender2009complexity].
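A straight-line program and its sign query are easy to write down; the difficulty is in the numbers, not the code. The following sketch (my illustration; instruction format is my choosing) evaluates an SLP over big integers and returns the sign. The catch is visible in the example: the intermediate values can have bit-length exponential in the program length, which is exactly why this naive evaluation does not put PosSLP in P.

```python
# Sketch (not from the paper): evaluating a straight-line program
# over +, -, * starting from the constant 1, and reporting the sign
# of its output. Naive evaluation uses big integers whose bit-length
# may be exponential in the program length.

def posslp_sign(program):
    """program: list of ('const',) or (op, i, j) with op in '+-*',
    where i, j index earlier instructions. Returns -1, 0, or 1."""
    vals = []
    for instr in program:
        if instr[0] == 'const':
            vals.append(1)
        else:
            op, i, j = instr
            a, b = vals[i], vals[j]
            vals.append(a + b if op == '+' else a - b if op == '-' else a * b)
    v = vals[-1]
    return (v > 0) - (v < 0)

# Build 2, square it 4 times (value 2^16), then subtract 2.
prog = [('const',), ('+', 0, 0)]          # v0 = 1, v1 = 2
for k in range(1, 5):
    prog.append(('*', k, k))              # repeated squaring
prog.append(('-', len(prog) - 1, 1))      # subtract 2
print(posslp_sign(prog))                  # 1
```

With n squarings the final value has about 2^n bits, so the program is short while the integer it denotes is astronomically long; deciding the sign without writing the integer out is the open difficulty.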

Real Computation Hardness: Order Types, Stretchability, and Geometric Representations

The manuscript discusses the seminal Mnëv–Shor universality theorems, which imply that the realization space of certain combinatorial geometric structures (order types, pseudoline arrangements) is as complex as any semialgebraic set. The stretchability problem for pseudoline arrangements (determining if a collection of pseudolines is homeomorphic to a straight-line arrangement) is ∃ℝ-complete, furnishing the first canonical "hard" problems for the class [M85, S91].

Crucially, the construction reduces ETR-AM constraints (conjunctions of the forms x = 1, x + y = z, and x · y = z) to partial order type realizability. Algebraic constraints are interpreted geometrically via gadgets employing the von Staudt symbolic construction, which encodes arithmetic among collinear points with projective invariances. The reduction leverages duality and projective transformations to model both additive and multiplicative relations entirely within the incidence structure. The reduction to full order type realization and stretchability requires further handling of intersection and separation invariants, as well as projectivization to obtain the necessary flexibility without losing computability.
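The core trick, arithmetic by drawing lines, can be demonstrated in an affine simplification of the von Staudt idea (using parallels in place of the projective line at infinity, so this is not the paper's exact gadget). Numbers are points (t, 0) on the x-axis, and addition and multiplication are realized purely by constructing lines and intersecting them with the axis, here with exact rational arithmetic.

```python
from fractions import Fraction as F

# Affine sketch of the von Staudt construction (my simplification,
# using parallels instead of a line at infinity): a number t is the
# point (t, 0), and + and * are computed by line intersections only.

def meet_axis(point, direction):
    """Intersect the line through `point` with direction vector
    `direction` with the x-axis."""
    (px, py), (dx, dy) = point, direction
    t = -py / dy              # solve py + t*dy = 0
    return px + t * dx

def add_pts(x, y):
    # Line through (x,0) and the auxiliary point (0,1); the parallel
    # line through (y,1) meets the axis at x + y.
    return meet_axis((y, F(1)), (-x, F(1)))

def mul_pts(x, y):
    # Line through (0,1) and (y,0); the parallel line through (0,x)
    # meets the axis at x*y (intercept theorem).
    return meet_axis((F(0), x), (y, F(-1)))

print(add_pts(F(3), F(4)))        # 7
print(mul_pts(F(3), F(4)))        # 12
print(mul_pts(F(1, 2), F(6)))     # 3
```

Because only incidences of lines and points are used, a satisfying assignment of an algebraic constraint system translates into a realizable incidence configuration, which is the engine of the universality reductions.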

Geometric representation problems, such as unit disk intersection graph recognition and optimal curve straightening, are established as ∃ℝ-complete, demonstrating the ubiquity of continuous hardness in geometric algorithmics and underscoring that even very "natural" geometric combinatorics inherit the full intractability of solving polynomial systems over the reals.

Subsequent work refines the algebraic constraint requirements for ∃ℝ-hardness. The investigation of ETR-INV (where only bounded addition and inversion constraints, x + y = z and x · y = 1, are permitted) demonstrates that even highly restricted forms of algebraic computation remain ∃ℝ-complete. Essential technical tools involve compactification (reductions to constraints over compact domains) and algebraic transforms that allow the simulation of multiplication and squaring with inversion and addition, enabling geometric embedding of variable gadgets into bounded segments.
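One standard way to recover multiplication from addition and inversion can be checked directly (the identities below are classical; the paper's exact gadgets may differ). Squaring follows from 1/x − 1/(x+1) = 1/(x(x+1)), and multiplication then follows from squaring by polarization, with halving expressible as an addition constraint z + z = s.

```python
from fractions import Fraction as F

# Sketch (classical identities, not necessarily the paper's exact
# gadget): squaring from + and inversion alone, via
#   1/x - 1/(x+1) = 1/(x*(x+1))  =>  x^2 = inv(1/x - 1/(x+1)) - x,
# then multiplication by polarization:
#   x*y = ((x+y)^2 - x^2 - y^2) / 2.

def inv(x):
    return F(1) / x

def square(x):
    """x^2 using only +, -, and inversion (valid for x not in {0, -1})."""
    return inv(inv(x) - inv(x + 1)) - x

def mul(x, y):
    s = square(x + y) - square(x) - square(y)
    return s / 2      # halving: z + z = s is an addition constraint

print(square(F(5)))        # 25
print(mul(F(3), F(7)))     # 21
print(mul(F(1, 2), F(4)))  # 2
```

The domain restrictions (x ∉ {0, −1} for the squaring identity) are exactly the kind of issue the compactification and bounded-domain arguments in the reductions must handle.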

A direct application arises in the art gallery problem: determining the minimal number of guards for a polygon. Even for simple polygons, optimal guard placements may require irrational coordinates, which places the problem naturally in ∃ℝ and precludes NP-membership via rational witnesses in general [abrahamsen2022artgallery, abrahamsen2017irrational]. The paper illustrates how variable segments and inversion constraints are encoded as geometric visibility constraints, using intricate polygon constructions to preclude rational approximations and ensure the full generality of semialgebraic set representation.

Theoretical and Practical Implications

The adoption of real computation models and the ∃ℝ class enables rigorous complexity analysis for entire domains of geometric, topological, and algebraic problems that had previously resisted classification in the Turing/word RAM model. Problems in computational geometry and network representation, as well as certain instances of numerical analysis and real algebraic geometry, can now be situated within the hierarchy and be subject to fine-grained reductions.

From a practical perspective, the theory motivates careful scrutiny whenever exact or even approximate computation with real numbers is abstracted away. Results proven in real RAM models must be transported to bit models only with explicit attention to bit-lengths and encoding, given the nontrivial gap between the two worlds. Conversely, there are classes of natural problems where the real model is manifestly more appropriate (e.g., determining geometric realizability, stretchability, or packing), and computational intractability there is best understood in ∃ℝ terms.

Conclusion

This manuscript meticulously introduces the theory of computation over the reals, showing its practical necessity and theoretical depth for analyzing problems beyond the classical discrete paradigm. By connecting fundamental problems in geometry, logic, and circuit evaluation to the ∃ℝ framework, it exposes the pervasive algebraic complexity at the heart of many natural computational problems. As a result, future advances in both theory and practical algorithms, especially those in combinatorial geometry, symbolic computation, and optimization, must account for the profound influence of semialgebraic complexity and the boundaries drawn by computation over the reals.
