Skeletonisation Algorithms with Theoretical Guarantees for Unorganised Point Clouds with High Levels of Noise (1901.03319v3)

Published 10 Jan 2019 in cs.CG

Abstract: Data Science aims to extract meaningful knowledge from unorganised data. Real datasets usually come in the form of a cloud of points with only pairwise distances. Numerous applications require visualising the overall shape of a noisy point cloud sampled from a non-linear object that is more complicated than a union of disjoint clusters. The skeletonisation problem, in its hardest form, is to find a 1-dimensional skeleton that correctly represents the shape of the cloud. This paper compares several algorithms that solve this skeletonisation problem for any point cloud and guarantee a successful reconstruction. For example, given a highly noisy point sample of an unknown underlying graph, a reconstructed skeleton should be geometrically close and homotopy equivalent to (i.e. have the same number of independent cycles as) the underlying graph. One of these algorithms produces a Homologically Persistent Skeleton (HoPeS) for any cloud without extra parameters. This universal skeleton contains subgraphs that provably represent the 1-dimensional shape of the cloud at any scale. Other subgraphs of HoPeS reconstruct an unknown graph from its noisy point sample with a correct homotopy type and within a small offset of the sample. Extensive experiments on synthetic and real data reveal, for the first time, the maximum level of noise that allows successful graph reconstructions.
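
For a graph, homotopy equivalence amounts to matching the number of independent cycles, i.e. the first Betti number b1 = |E| - |V| + (number of connected components). The sketch below is not the paper's HoPeS construction; it is a minimal illustration, using SciPy, of the Euclidean minimum spanning tree that persistence-based skeletons of this kind roughly start from. The function name mst_skeleton and the toy circle sample are illustrative assumptions, not from the paper.

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.sparse.csgraph import minimum_spanning_tree

    def mst_skeleton(points):
        # Pairwise Euclidean distances of the cloud (dense n x n matrix).
        dists = squareform(pdist(points))
        # Minimum spanning tree: a connected, cycle-free backbone that
        # persistence-based skeletons extend with extra cycle-creating edges.
        mst = minimum_spanning_tree(dists)
        rows, cols = mst.nonzero()
        return list(zip(rows.tolist(), cols.tolist()))

    # Toy input: a noisy sample of a circle, whose underlying graph has one
    # independent cycle that a tree alone cannot recover.
    rng = np.random.default_rng(0)
    angles = rng.uniform(0.0, 2.0 * np.pi, 200)
    cloud = np.c_[np.cos(angles), np.sin(angles)] + 0.05 * rng.normal(size=(200, 2))
    edges = mst_skeleton(cloud)
    print(len(edges))  # a spanning tree on 200 points has 199 edges

Roughly speaking, constructions such as HoPeS add the most persistent cycle-creating edges on top of such a tree; computing those edges requires 1-dimensional persistent homology of the distance filtration and is beyond this sketch.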
