Compressive Neural Representations of Volumetric Scalar Fields (2104.04523v1)

Published 11 Apr 2021 in cs.LG and cs.GR

Abstract: We present an approach for compressing volumetric scalar fields using implicit neural representations. Our approach represents a scalar field as a learned function, wherein a neural network maps a point in the domain to an output scalar value. By setting the number of weights of the neural network to be smaller than the input size, we achieve compressed representations of scalar fields, thus framing compression as a type of function approximation. Combined with carefully quantizing network weights, we show that this approach yields highly compact representations that outperform state-of-the-art volume compression approaches. The conceptual simplicity of our approach enables a number of benefits, such as support for time-varying scalar fields, optimizing to preserve spatial gradients, and random-access field evaluation. We study the impact of network design choices on compression performance, highlighting how simple network architectures are effective for a broad range of volumes.
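
The abstract describes the full recipe at a high level: overfit a small coordinate network to a single volume so that the weight count falls below the voxel count, then quantize the weights to shrink storage further. The sketch below (not the authors' released code) illustrates that idea in PyTorch; the sinusoidal activations, layer sizes, sampling scheme, and uniform quantizer are illustrative assumptions rather than details stated in the abstract.

```python
# Illustrative sketch of an implicit neural representation for a scalar
# field. Architecture and training details are assumptions, not taken
# from the paper's abstract.
import torch
import torch.nn as nn

class SineLayer(nn.Module):
    """Linear layer followed by a scaled sine activation (SIREN-style)."""
    def __init__(self, in_features, out_features, omega=30.0):
        super().__init__()
        self.omega = omega
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        return torch.sin(self.omega * self.linear(x))

class ImplicitField(nn.Module):
    """Maps a coordinate (x, y, z) in [-1, 1]^3 to one scalar value."""
    def __init__(self, hidden=64, depth=4):
        super().__init__()
        layers = [SineLayer(3, hidden)]
        layers += [SineLayer(hidden, hidden) for _ in range(depth - 1)]
        self.net = nn.Sequential(*layers, nn.Linear(hidden, 1))

    def forward(self, coords):
        return self.net(coords)

model = ImplicitField()

# Compression comes from the parameter budget: the representation is
# "compressed" when the weight count is smaller than the voxel count.
n_params = sum(p.numel() for p in model.parameters())
print(f"network parameters: {n_params}")

# Fit the network to (coordinate, value) samples drawn from the volume.
# The ground-truth values here are placeholders.
coords = torch.rand(4096, 3) * 2 - 1   # random points in the domain
values = torch.rand(4096, 1)           # stand-in for sampled field values
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
for _ in range(1000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(coords), values)
    loss.backward()
    opt.step()

# Simple uniform weight quantization to k bits. The abstract says the
# weights are "carefully" quantized but does not give the scheme, so
# this uniform quantizer is only a stand-in.
def quantize(t, bits=8):
    lo, hi = t.float().min(), t.float().max()
    scale = (hi - lo) / (2 ** bits - 1) + 1e-12
    return torch.round((t - lo) / scale) * scale + lo

with torch.no_grad():
    for p in model.parameters():
        p.copy_(quantize(p))
```

Because the field is stored as a function, any point can be evaluated independently with a single forward pass, which is what enables the random-access evaluation the abstract mentions; spatial gradients can likewise be obtained by automatic differentiation of the network with respect to its input coordinates.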

Authors (4)
  1. Yuzhe Lu (22 papers)
  2. Kairong Jiang (2 papers)
  3. Joshua A. Levine (107 papers)
  4. Matthew Berger (22 papers)
Citations (74)
