
Noise-Robust Wrapped-Number Search

Updated 16 December 2025
  • Noise-robust wrapped-number search is a statistical approach for reconstructing latent variables from modular measurements affected by significant noise.
  • It employs advanced MAP estimators, wrapped Bayesian mixture models, and error-correcting clustering to achieve accurate recovery in high-noise and cyclic settings.
  • The framework enhances applications in phase unwrapping, angular data regression, and RFID localization by addressing ambiguity through modular arithmetic and robust inference.

Noise-robust wrapped-number search refers to a suite of statistical and algorithmic techniques for the recovery or localization of real or discrete variables whose observable signatures are “wrapped” (subject to modular arithmetic) and received in the presence of significant noise. This problem arises in diverse contexts, including ambiguity resolution for phase unwrapping, robust Chinese Remainder Theorem (CRT) reconstruction, angular data regression, and search on cyclic graphs. Rigorous advances focus on maximum a posteriori (MAP) estimators for clustering and parameter inference, Bayesian mixture models adapted to wrapped noise, robust likelihoods, and algorithmic search procedures designed to approach information-theoretic boundaries in noisy modular or cyclic environments.

1. Statistical Formulations and Noise Models

Wrapped-number search involves reconstructing latent variables $x_i$ from noisy observations "wrapped" modulo a set of moduli, i.e., observed as $r_{i,\ell} = (x_i + \Delta_{i,\ell}) \bmod m_\ell$, where the $\Delta_{i,\ell}$ are independent noise variables. Typical models assume:

  • Noise $\Delta_{i,\ell}$ is Gaussian, wrapped to $[0, m_\ell)$, with variance $\sigma_\ell^2$.
  • The unknowns $x_i \in [0, D)$ admit a modular decomposition: $x_i = k_i \Gamma + \mu_i$ with $\mu_i \in [0, \Gamma)$, $k_i$ integral, and $\Gamma$ the common modulus scale.
  • Prior distributions are uniform on both $k_i$ and $\mu_i$.
  • The problem generalizes to settings where the wrapped output is an angular variable, as in robust Gaussian process regression for circular data using a wrapped transformation (Cooper et al., 29 Nov 2025).

This statistical wrapping framework enables rigorous modeling via likelihoods of the form:

$$p(r_{i,\ell} \mid x_i) = \sum_{n \in \mathbb{Z}} \frac{1}{\sqrt{2\pi}\,\sigma_\ell} \exp\!\left(-\frac{(r_{i,\ell} - x_i + n m_\ell)^2}{2\sigma_\ell^2}\right),$$

where all calculations are performed modulo the relevant cycle or interval.
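As a concrete illustration, the following sketch simulates wrapped residues under this model and evaluates the wrapped-Gaussian likelihood on a grid. The moduli, noise levels, truncation of the wrap sum, and grid resolution are illustrative assumptions, not values from the cited papers; with few moduli the likelihood surface can be ambiguous.

```python
import numpy as np

def wrapped_gaussian_pdf(r, x, sigma, m, n_terms=5):
    """Wrapped-Gaussian likelihood p(r | x) on [0, m): a Gaussian centred
    at x mod m, summed over integer wraps n = -n_terms..n_terms."""
    ns = np.arange(-n_terms, n_terms + 1)
    diffs = r - (x % m) + ns * m
    return np.sum(np.exp(-diffs**2 / (2 * sigma**2))) / (np.sqrt(2 * np.pi) * sigma)

rng = np.random.default_rng(0)
moduli = np.array([11.0, 13.0, 17.0])   # assumed example moduli m_ell
sigma  = np.array([0.3, 0.3, 0.3])      # per-modulus noise std sigma_ell
x_true = 57.2                           # latent value in [0, D)

# Observed residues r_ell = (x + noise) mod m_ell
residues = (x_true + rng.normal(0, sigma)) % moduli

# Log-likelihood of a candidate x given all residues (independent across ell)
def log_likelihood(x):
    return sum(np.log(wrapped_gaussian_pdf(r, x, s, m))
               for r, s, m in zip(residues, sigma, moduli))

# Grid search over [0, D) illustrates the multimodal, wrapped likelihood surface
D = np.lcm.reduce(np.array([11, 13, 17]))   # dynamic range = lcm of moduli
grid = np.linspace(0, D, 20000)
x_hat = grid[np.argmax([log_likelihood(x) for x in grid])]
print(f"true x = {x_true:.2f}, MAP grid estimate = {x_hat:.2f}")
```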

2. MAP Estimation and Residue Clustering

A fundamental challenge is the assignment of noisy residues $r_{i,\ell}$ to the latent variables $x_i$, termed residue clustering. Two principal MAP-based approaches are established (Xiao et al., 2019, Du et al., 2018):

  • Conditional Clustering MAP: For a fixed candidate assignment of residues to sources, the conditional likelihood integrates over the common remainders $\mu_i$ to yield a sum of Gaussian terms. Under nonoverlap assumptions (residue clusters confined to disjoint arcs), the MAP clustering reduces to combinatorial optimization over cyclic “cut points,” transforming the problem to sorting, matching, and evaluating quadratic forms.
  • Joint MAP via Wrapped Gaussian Mixture EM: This approach alternates between residue assignment (E-step: minimizing summed circular distances between residues and current centroids) and centroid update (M-step: solving for the mean that minimizes the aggregate squared circular distance), using the circular distance $d(a, b) = \min(|a - b|,\, m_\ell - |a - b|)$. Each E-step can be solved efficiently via cyclic shifts of sorted lists.

Both algorithms achieve significant practical gains in moderate and high noise scenarios, robustly recovering the latent $x_i$ even as the number of unknowns grows.
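A minimal circular k-means sketch of the E/M alternation is given below. It uses the standard circular distance above and the extrinsic (resultant-vector) circular mean as a stand-in for the exact M-step minimizer; the papers' more efficient E-step via cyclic shifts of sorted lists is omitted.

```python
import numpy as np

def circ_dist(a, b, m):
    """Circular distance on [0, m): min(|a - b|, m - |a - b|)."""
    d = np.abs(a - b) % m
    return np.minimum(d, m - d)

def circular_mean(vals, m):
    """Mean on the circle of circumference m, via the resultant-vector
    (extrinsic) mean -- a standard stand-in for the M-step minimiser."""
    ang = 2 * np.pi * vals / m
    return (m * np.arctan2(np.sin(ang).sum(), np.cos(ang).sum()) / (2 * np.pi)) % m

def wrapped_kmeans(residues, k, m, iters=50, seed=0):
    """E/M iteration: assign residues to the nearest centroid in circular
    distance, then recompute each centroid as a circular mean."""
    rng = np.random.default_rng(seed)
    centroids = rng.uniform(0, m, size=k)
    for _ in range(iters):
        # E-step: nearest centroid under circular distance
        assign = np.argmin(circ_dist(residues[:, None], centroids[None, :], m), axis=1)
        # M-step: circular mean of each cluster
        for j in range(k):
            if np.any(assign == j):
                centroids[j] = circular_mean(residues[assign == j], m)
    return centroids, assign

# Toy example: two latent remainders on [0, 10) observed with wrapped noise
rng = np.random.default_rng(1)
true_mu = np.array([1.0, 6.5])
obs = (np.repeat(true_mu, 40) + rng.normal(0, 0.4, 80)) % 10.0
centroids, _ = wrapped_kmeans(obs, k=2, m=10.0)
print(np.sort(centroids))  # approx. [1.0, 6.5]
```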

3. Wrapped Bayesian Mixture Models and Robust Inference

Statistical wrapped-number search generalizes the classic deterministic CRT by interpreting residue observations as samples from a wrapped Gaussian mixture, where both residue-permutation (clustering) and latent modular remainders are unobserved variables (Du et al., 2018, Cooper et al., 29 Nov 2025). Bayesian inference proceeds by:

  • Assigning uniform or structured priors on clustering assignments and modular components.
  • Employing iterative maximization or Gibbs/MCMC sampling over both assignments and centroids.
  • In Gaussian-process-structured wrapped regression, constructing robust posteriors by combining Gaussian-process priors on the latent function, monotonic wrap constraints on the input space (through variable cut-points), and heavy-tailed noise models (a Student's $t$-likelihood) to increase outlier robustness (Cooper et al., 29 Nov 2025).

For GP-based angular regression, the joint posterior on the latent function, wrap count, and hyperparameters is sampled by hybrid Metropolis-Hastings and elliptical slice sampling algorithms, yielding predictive distributions that properly reflect modular uncertainty and noise.
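The sketch below isolates the robustness mechanism: a random-walk Metropolis sampler for a single circular location under a Student's $t$ likelihood on circular residuals. It is a simplified stand-in for the full wrapped-GP model, which couples a latent process prior, wrap constraints, and elliptical slice sampling; the scale, degrees of freedom, and data are illustrative assumptions.

```python
import numpy as np

def circ_diff(a, b, m):
    """Signed circular difference in (-m/2, m/2]."""
    return (a - b + m / 2) % m - m / 2

def log_post(mu, obs, m, scale=0.5, nu=3.0):
    """Log-posterior (flat prior) of a circular location mu under a
    Student's t likelihood on circular residuals: heavy tails
    down-weight outliers compared with a wrapped Gaussian."""
    z = circ_diff(obs, mu, m) / scale
    return np.sum(-(nu + 1) / 2 * np.log1p(z**2 / nu))

def mh_sample(obs, m, n_samples=5000, step=0.3, seed=0):
    """Random-walk Metropolis over the circle [0, m)."""
    rng = np.random.default_rng(seed)
    mu = rng.uniform(0, m)
    lp = log_post(mu, obs, m)
    out = np.empty(n_samples)
    for t in range(n_samples):
        prop = (mu + rng.normal(0, step)) % m
        lp_prop = log_post(prop, obs, m)
        if np.log(rng.uniform()) < lp_prop - lp:
            mu, lp = prop, lp_prop
        out[t] = mu
    return out

# Data near 0.2 on a circle of circumference 1, plus gross outliers
rng = np.random.default_rng(2)
obs = np.concatenate([(0.2 + rng.normal(0, 0.03, 50)) % 1.0,
                      rng.uniform(0, 1.0, 5)])   # 5 outliers
samples = mh_sample(obs, m=1.0)
print(f"posterior median location ~ {np.median(samples[1000:]):.3f}")
```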

4. Remainder Error-Correcting Codes and Majority Decoding

To further improve robustness against erroneous residue assignments and heavy-tailed noise, error-correcting-code-inspired techniques are integrated into wrapped-number search (Xiao et al., 2019, Du et al., 2018). Key components include:

  • Introducing redundancy by selecting $L$ moduli whose least common multiple (lcm) vastly exceeds the dynamic range $D$.
  • Defining a minimum subset size $L_0$ such that the lcm of any $L_0$ of the moduli still exceeds $D$.
  • Applying a majority vote over all (or a randomized subset of) the $\binom{L}{L_0}$ possible subgroups, accepting reconstructions from clusters where a sufficient number of assignments remain within a bounded noise span.
  • The resulting schemes can tolerate a bounded number of cluster errors per signal while guaranteeing that the recovery error stays on the order of the noise scale.

This error-correcting residue coding is crucial in hostile or adversarial noise settings, where naive residue matching could otherwise fail catastrophically.
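A minimal sketch of subset-CRT majority decoding follows, for the integer case with exact residues: reconstruct over every size-$L_0$ subset of moduli (with $L_0$ as defined above) and let the candidates vote. The moduli, injected error, and range check against $D$ are illustrative assumptions.

```python
from itertools import combinations
from math import prod, lcm
from collections import Counter

def crt(residues, moduli):
    """Chinese Remainder Theorem for pairwise coprime moduli."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m): modular inverse (Py 3.8+)
    return x % M

def majority_decode(residues, moduli, L0, D):
    """Reconstruct x < D from residues, some possibly wrong, by running
    CRT over every size-L0 subset and taking a majority vote."""
    votes = Counter()
    for idx in combinations(range(len(moduli)), L0):
        sub_m = [moduli[i] for i in idx]
        if lcm(*sub_m) < D:
            continue                    # subset cannot cover the range
        cand = crt([residues[i] for i in idx], sub_m)
        if cand < D:
            votes[cand] += 1
    return votes.most_common(1)[0][0] if votes else None

# Example: x = 1234, five coprime moduli, one corrupted residue
moduli = [7, 11, 13, 17, 19]
x = 1234
residues = [x % m for m in moduli]
residues[2] = (residues[2] + 5) % moduli[2]   # inject one residue error
print(majority_decode(residues, moduli, L0=3, D=2000))  # -> 1234
```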

5. Algorithmic Complexity and Information-Theoretic Limits

Analysis of computational complexity and information-theoretic performance bounds reveals the following (Xiao et al., 2019, Du et al., 2018, Dereniowski et al., 2021):

  • Deterministic CRT and RCRT: suffer failure probabilities that grow rapidly with the number of unknowns under moderate noise, require the common scale $\Gamma$ to grow with the noise level for stability, and incur computational cost that grows combinatorially with the number of moduli.
  • Statistical MAP and EM approaches: achieve low polynomial complexity per trial (with EM typically converging in few rounds), sustaining robust recovery at a noise tolerance essentially independent of the problem size.
  • Noisy wrapped search on cycles (discrete case): On the $n$-cycle, noisy search achieves expected query complexity

$$\frac{\log_2 n + O(\log\log n) + O(\log(1/\delta))}{1 - H(p)}$$

for comparison error rate $p < 1/2$ and failure probability $\delta$, where $1 - H(p)$ (with $H$ the binary entropy) is the information gain per query. Worst-case and expected-case bounds match information-theoretic lower bounds up to lower-order additive terms, and the search algorithms maintain polynomial time even under adversarial noise (Dereniowski et al., 2021).

These results establish that statistical and code-augmented approaches are strictly superior to deterministic schemes as the noise level increases, especially in heavy-noise or adversarial regimes.
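The following sketch implements the Bayesian multiplicative-weights strategy that underlies such guarantees, here on a path rather than a cycle: maintain a posterior over positions, query the posterior median, and reweight by the likelihood of each noisy comparison answer. The error rate, stopping rule, and problem size are illustrative assumptions; the cited algorithm adds refinements for the cycle and for worst-case bounds.

```python
import numpy as np

def noisy_search(n, target, p=0.2, delta=1e-3, seed=0):
    """Bayesian strategy for noisy binary search: keep a posterior over
    candidate positions, query the posterior median, and update weights
    multiplicatively by the answer's likelihood ((1 - p) if consistent,
    p if not). Stop once one candidate holds mass 1 - delta."""
    rng = np.random.default_rng(seed)
    w = np.full(n, 1.0 / n)                  # uniform prior
    queries = 0
    while w.max() < 1 - delta:
        q = int(np.searchsorted(np.cumsum(w), 0.5))     # posterior median
        truth = target <= q                              # noiseless answer
        answer = truth if rng.uniform() > p else not truth  # flip w.p. p
        # Likelihood update: positions consistent with the answer get 1 - p
        consistent = (np.arange(n) <= q) == answer
        w *= np.where(consistent, 1 - p, p)
        w /= w.sum()
        queries += 1
    return int(np.argmax(w)), queries

est, q = noisy_search(n=1 << 16, target=12345)
print(f"found {est} in {q} noisy queries")  # ~ log2(n) / (1 - H(0.2)) queries
```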

6. Applications, Empirical Performance, and Design Practice

Applications of noise-robust wrapped-number search span array signal processing, phase unwrapping, RFID localization, and cyclic/binary search under modular constraints (Cooper et al., 29 Nov 2025). Empirical studies demonstrate:

  • In simulated multi-signal experiments at moderate SNR, deterministic CRT recovers only a fraction of the signals on average, conditional MAP clustering recovers substantially more, and joint MAP (EM) performs best; at high SNR, all approaches converge to near-perfect average recovery.
  • Error-correcting extensions enable reliable recovery even with significant cluster errors, as long as the residue span is bounded.
  • Wrapped Gaussian process regression with one-directional wraps and heavy-tailed likelihoods outperforms prior methods under extreme noise and outlier presence, both in simulated data and real-world RFID phase-versus-frequency problems (Cooper et al., 29 Nov 2025).

Recommended practical guidelines include: choosing the common scale $\Gamma$ comfortably larger than the noise standard deviation; employing approximately equal-sized base moduli; keeping the required subset size $L_0$ small by using large primes to enhance code-correctability; leveraging joint MAP (EM) when additional compute budget is available; and always applying error-correcting residue decoding for robust reconstruction even at the edge of stability (Xiao et al., 2019, Du et al., 2018).
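A small helper can check a candidate design against these guidelines. The rule-of-thumb margin $\Gamma = 8\sigma$ and the specific primes below are assumptions for illustration, not values from the cited papers.

```python
from math import lcm, prod
from itertools import combinations

def min_subset_size(bases, K):
    """Smallest L0 such that EVERY size-L0 subset of the pairwise coprime
    base moduli has lcm >= K (K = D / Gamma, the range the base lcm must
    cover). More redundancy (L - L0) means more correctable residue errors."""
    for L0 in range(1, len(bases) + 1):
        if all(lcm(*c) >= K for c in combinations(bases, L0)):
            return L0
    return None

# Assumed design sketch: Gamma well above the noise sigma, and large,
# approximately equal prime bases M_ell; the actual moduli are m_ell = Gamma * M_ell.
sigma = 0.5
Gamma = 8 * sigma                  # rule-of-thumb margin, an assumption
bases = [101, 103, 107, 109, 113]  # near-equal primes
D = 10**7                          # required dynamic range for x
K = D / Gamma
print("lcm of all bases:", prod(bases))          # coprime, so lcm = product
print("minimum subset size L0:", min_subset_size(bases, K))
```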


References

  • Xiao et al., "Statistical Robust Chinese Remainder Theorem for Multiple Numbers" (Xiao et al., 2019)
  • Du et al., "Statistical Robust Chinese Remainder Theorem for Multiple Numbers: Wrapped Gaussian Mixture Model" (Du et al., 2018)
  • Dereniowski, Łukasiewicz, Uznański, "Noisy (Binary) Searching: Simple, Fast and Correct" (Dereniowski et al., 2021)
  • Cooper et al., "Robust Wrapped Gaussian Process Inference for Noisy Angular Data" (Cooper et al., 29 Nov 2025)
