
Learning Iterative Robust Transformation Synchronization (2111.00728v1)

Published 1 Nov 2021 in cs.CV

Abstract: Transformation synchronization is the problem of recovering absolute transformations from a given set of pairwise relative motions. Despite its usefulness, the problem remains challenging due to the influence of noisy and outlier relative motions, and the difficulty of modeling them analytically and suppressing them with high fidelity. In this work, we avoid handcrafting robust loss functions and instead propose to use graph neural networks (GNNs) to learn transformation synchronization. Unlike previous works that use complicated multi-stage pipelines, we take an iterative approach in which each step consists of a single weight-shared message-passing layer that refines the absolute poses from the previous iteration by predicting an incremental update in the tangent space. To reduce the influence of outliers, the messages are weighted before aggregation. Our iterative approach alleviates the need for an explicit initialization step and performs well with identity initial poses. Although our approach is simple, we show that it performs favorably against existing handcrafted and learned synchronization methods through experiments on both SO(3) and SE(3) synchronization.
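The iterative tangent-space update the abstract describes can be sketched roughly as follows for the SO(3) case. This is a minimal illustration under stated assumptions, not the authors' implementation: the learned, GNN-predicted message weights are replaced by a hand-coded Cauchy-style robust weight, and the function name synchronize_so3 and parameters c and n_iters are hypothetical. What carries over from the paper's description is the overall structure: start from identity poses, compute per-edge residuals, map them to the tangent space via the log map, weight and aggregate them as messages, and apply the increment via the exponential map.

```python
# Sketch of iterative robust SO(3) synchronization with weighted
# tangent-space messages. The robust weight below is a stand-in for the
# weights a learned message-passing layer would predict.
import numpy as np
from scipy.spatial.transform import Rotation as R

def synchronize_so3(edges, rel_rots, n_nodes, n_iters=50, c=0.5):
    """Recover absolute rotations from pairwise relative rotations.

    edges    : list of (i, j) index pairs
    rel_rots : list of 3x3 matrices R_ij with the convention R_i ~= R_ij @ R_j
    Starts from identity poses, which the paper reports works well.
    """
    abs_rots = [np.eye(3) for _ in range(n_nodes)]
    for _ in range(n_iters):
        # Accumulate weighted tangent-space "messages" per node.
        msgs = [np.zeros(3) for _ in range(n_nodes)]
        wsum = [1e-12] * n_nodes
        for (i, j), R_ij in zip(edges, rel_rots):
            # Residual rotation: discrepancy between node i's current pose
            # and the pose implied by node j and the relative measurement.
            res = abs_rots[i].T @ (R_ij @ abs_rots[j])
            v = R.from_matrix(res).as_rotvec()  # log map to tangent space
            # Cauchy-style robust weight downweights outlier edges
            # (assumed here; the paper learns these weights with a GNN).
            w = 1.0 / (1.0 + (np.linalg.norm(v) / c) ** 2)
            msgs[i] += w * v
            wsum[i] += w
        # Apply the aggregated increment via the exponential map.
        for k in range(n_nodes):
            step = 0.5 * msgs[k] / wsum[k]  # damped update for stability
            abs_rots[k] = abs_rots[k] @ R.from_rotvec(step).as_matrix()
    return abs_rots
```

The fixed damping factor and Cauchy scale here are arbitrary choices for the sketch; in the paper, a single weight-shared message-passing layer effectively learns both the weighting and the size of the incremental update.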

Citations (16)
