Efficient multi-concept inference for training-free OVCD

Develop efficient inference schemes that enable CoRegOVCD and related training-free open-vocabulary change detection frameworks to handle multiple queried concepts jointly, or with minimal per-concept overhead, while preserving accuracy.

Background

The current pipeline performs concept-specific inference: it is effective and comparatively fast for single-query evaluation, but evaluating multiple concepts requires one pass per query, so cost grows linearly with the number of concepts. The efficiency analysis demonstrates competitive single-concept throughput, which motivates mechanisms that amortize computation across concepts or process multiple queries jointly without sacrificing accuracy.
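One natural way to amortize computation is to extract the bi-temporal image features once and score all queried concepts in a single batched text-side operation. The sketch below illustrates this idea for a posterior-differencing scheme; it is a minimal illustration, not the CoRegOVCD implementation. The function name `multi_concept_change_maps`, the CLIP-style normalized patch/text embeddings, and the temperature `tau` are all assumptions for the example.

```python
import numpy as np

def multi_concept_change_maps(feat_t0, feat_t1, concept_embs, tau=0.07):
    """Score all queried concepts in one pass over shared image features.

    feat_t0, feat_t1 : (P, D) L2-normalized patch features for the two dates
    concept_embs     : (C, D) L2-normalized text embeddings, one per concept
    Returns a (C, P) array of per-concept change scores (absolute
    posterior differences), computed without re-running the image encoder.
    """
    # Patch-text similarities for every concept at once: the image features
    # are reused, so adding concepts only adds one small matmul column.
    logits_t0 = feat_t0 @ concept_embs.T / tau  # (P, C)
    logits_t1 = feat_t1 @ concept_embs.T / tau  # (P, C)

    def softmax(x):
        # Numerically stable softmax over the concept axis.
        e = np.exp(x - x.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    # Per-patch posteriors over the queried concepts at each date.
    p_t0, p_t1 = softmax(logits_t0), softmax(logits_t1)
    # Posterior differencing: change evidence per concept, per patch.
    return np.abs(p_t1 - p_t0).T  # (C, P)
```

Because the expensive image-side features are computed once and shared, the per-concept marginal cost reduces to a matrix product over text embeddings; the open question raised above is whether such joint scoring can match the accuracy of repeated concept-specific passes.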

References

Prompt robustness, threshold transfer, and efficient multi-concept inference remain open, but the central conclusion is clear: posterior differencing, once properly regularized, provides a stronger foundation for training-free OVCD.

CoRegOVCD: Consistency-Regularized Open-Vocabulary Change Detection  (2604.02160 - Tang et al., 2 Apr 2026) in Conclusion (final paragraph)