Robust remote glimpse extraction under occlusion and adverse viewpoints

Develop computer vision methods that reliably estimate motion glimpse sequences from remote camera feeds when robots are partially obscured by other moving bodies, viewed from unfavorable camera angles, or subject to self-occlusion, so that Colored Noise Coherency (CoNoCo) watermarks remain remotely detectable from such degraded observations.

Background

The paper formalizes remote watermark detection for robotic policies using only external observations, modeled as glimpse sequences derived from sensors such as video feeds. CoNoCo embeds a frequency-band signature into the policy’s stochastic exploration and detects it via spectral coherency, which is robust to unknown system dynamics.
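To make the detection principle concrete, the toy sketch below (not the paper's implementation; the band limits, signal lengths, and amplitudes are invented for illustration) embeds a band-limited "colored noise" signature into a noisy scalar motion signal and recovers it via magnitude-squared coherence, which is insensitive to the unknown (linear) dynamics between the injected noise and the observed motion:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 100.0   # hypothetical glimpse sampling rate (Hz)
n = 4096

# Hypothetical watermark key: white noise band-pass filtered to 5-10 Hz
sos = signal.butter(4, [5, 10], btype="bandpass", fs=fs, output="sos")
watermark = signal.sosfilt(sos, rng.standard_normal(n))

# Observed "glimpse" signal: watermark buried in unrelated motion noise
observed = 2.0 * watermark + rng.standard_normal(n)

# Spectral coherency between the known key and the remote observation
f, cxy = signal.coherence(watermark, observed, fs=fs, nperseg=512)
band = (f >= 5) & (f <= 10)

in_band = cxy[band].mean()     # high: signature present in its band
out_band = cxy[~band].mean()   # low: only estimation noise elsewhere
```

A detector would threshold the in-band coherence (or a statistic derived from it) to decide whether the watermark is present; the key point is that coherence compares spectra rather than raw trajectories, so it tolerates unknown gains and phase shifts introduced by the robot's dynamics.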

All experiments used stationary top- or side-view cameras with full visibility. The authors note that real-world deployments may involve partial occlusion, poor camera angles, or self-occlusion, which degrade the quality of motion estimates from video and challenge remote detection. Addressing these cases requires advanced computer vision techniques to robustly extract motion glimpses despite visual obstructions and viewpoint limitations.
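A minimal way to study this degradation in simulation, before building full vision pipelines, is to drop a fraction of glimpse samples (mimicking frames where the robot is occluded) and gap-fill before computing coherence. The sketch below, under the same invented signal parameters as a toy CoNoCo-style setup (band-limited key at 5-10 Hz, unit-variance nuisance motion), uses naive linear interpolation over occluded frames; more advanced vision or imputation methods would replace this step:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 100.0
n = 4096
sos = signal.butter(4, [5, 10], btype="bandpass", fs=fs, output="sos")
watermark = signal.sosfilt(sos, rng.standard_normal(n))
observed = 2.0 * watermark + rng.standard_normal(n)

# Simulate occlusion: ~20% of frames are unobserved at random
mask = rng.random(n) > 0.2
t = np.arange(n)

# Naive recovery: linearly interpolate across occluded frames
filled = np.interp(t, t[mask], observed[mask])

f, c_full = signal.coherence(watermark, observed, fs=fs, nperseg=512)
_, c_occl = signal.coherence(watermark, filled, fs=fs, nperseg=512)
band = (f >= 5) & (f <= 10)

full_score = c_full[band].mean()   # coherence with full visibility
occl_score = c_occl[band].mean()   # coherence after occlusion + gap fill
```

With scattered single-frame dropouts, in-band coherence largely survives interpolation; long contiguous occlusions (a body passing in front of the robot) or viewpoint-dependent estimation errors are the harder regime this open problem targets.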

References

Such setups would make it challenging to extract reliable motion glimpse estimates. Addressing this limitation would require more advanced computer vision techniques, which are beyond the scope of this work and are left for future study.

Remotely Detectable Robot Policy Watermarking (Amir et al., 17 Dec 2025, arXiv:2512.15379), Appendix, Section "Open Questions and Limitations"