
OceanSplat: 3D Gaussian Splatting Underwater

Updated 15 January 2026
  • OceanSplat is a family of frameworks that use physics-informed 3D Gaussian Splatting to reconstruct, simulate, and model underwater phenomena.
  • It employs channel-decoupled learning and adaptive regularization to accurately restore colors and geometry in challenging aquatic conditions.
  • It integrates sparse-view optimization and edge-aware techniques to ensure robust, real-time reconstructions and simulations in variable oceanic settings.

OceanSplat refers to a family of techniques and frameworks that leverage 3D Gaussian Splatting—an explicit, differentiable, point-based scene representation—for dense reconstruction, simulation, or modeling of oceanic and underwater phenomena. The term encompasses diverse subfields: high-fidelity underwater scene rendering and restoration in complex aquatic media; real-time geometric and visual reconstruction from sparse or degraded viewpoints; and simulation of physical processes such as sea spray and water surface dynamics. OceanSplat implementations are technically rigorous, combining spatially adaptive image formation models, channel-wise physical parameter learning, advanced loss regularization, and specialized data acquisition strategies for maritime environments.

1. Foundations: 3D Gaussian Splatting and Underwater Extensions

3D Gaussian Splatting (3DGS) forms the mathematical and algorithmic core of OceanSplat frameworks. Scenes are represented as sets of anisotropic Gaussian primitives $\{G_i\}$, each with spatial mean $\mu_i \in \mathbb{R}^3$, covariance $\Sigma_i \in \mathbb{R}^{3 \times 3}$, opacity $\alpha_i \in [0,1]$, and view-dependent color $c_i(v) \in \mathbb{R}^3$. At render time, each Gaussian projects to the image plane as $G_i(x) = \exp\big(-\tfrac{1}{2}(x - \mu_i)^\top \Sigma_i^{-1} (x - \mu_i)\big)$. The final pixel color is obtained by front-to-back alpha blending:

$$C(x) = \sum_{i=1}^{N} \alpha_i \, c_i(v) \prod_{j<i} (1 - \alpha_j),$$

where $c_i(v)$ is typically expanded in spherical harmonics (SH).
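The compositing rule above can be sketched in a few lines of NumPy (a minimal illustration, assuming the Gaussians are already depth-sorted and projected to per-ray samples; function and argument names are ours, not any paper's API):

```python
import numpy as np

def composite_front_to_back(alphas, colors):
    """Front-to-back alpha blending over depth-sorted Gaussian samples.

    alphas: (N,) opacities in [0, 1], sorted near-to-far along the ray.
    colors: (N, 3) per-Gaussian view-dependent RGB colors.
    Returns C(x) = sum_i alpha_i * c_i * prod_{j<i} (1 - alpha_j).
    """
    transmittance = 1.0          # accumulated product of (1 - alpha_j)
    pixel = np.zeros(3)
    for a, c in zip(alphas, colors):
        pixel += transmittance * a * np.asarray(c, dtype=float)
        transmittance *= (1.0 - a)
    return pixel
```

Note that once the accumulated transmittance reaches zero (a fully opaque sample), later Gaussians contribute nothing, which is what makes early ray termination possible in practice.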

OceanSplat adapts 3DGS for underwater environments by integrating a physics-driven attenuation and scattering model. The underwater image formation follows

$$I_{ch} = J_{ch} \cdot T^D_{ch}(z) + B^\infty_{ch} \cdot \big(1 - T^B_{ch}(z)\big),$$

with channel-wise direct transmission $T^D_{ch}(z) = \exp(-\beta^d_{ch}\, z)$ and backscatter transmission $T^B_{ch}(z) = \exp(-\beta^b_{ch}\, z)$, incorporating per-channel attenuation and backscatter coefficients $\beta^d_{ch}$, $\beta^b_{ch}$ and scene depth $z$ (Jiang et al., 21 May 2025, Yang et al., 2024, Li et al., 13 Aug 2025, Li et al., 2024).
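A minimal sketch of this image-formation model, assuming per-channel coefficient arrays (variable names are illustrative, not taken from the cited papers):

```python
import numpy as np

def underwater_formation(J, z, beta_d, beta_b, B_inf):
    """Channel-wise underwater image formation:
    I_ch = J_ch * exp(-beta_d_ch * z) + B_inf_ch * (1 - exp(-beta_b_ch * z)).

    J:      (3,) restored (water-free) scene color
    z:      scalar range/depth along the ray
    beta_d: (3,) per-channel attenuation coefficients (direct path)
    beta_b: (3,) per-channel backscatter coefficients
    B_inf:  (3,) veiling-light (background) color
    """
    T_d = np.exp(-np.asarray(beta_d) * z)  # direct transmission T^D_ch(z)
    T_b = np.exp(-np.asarray(beta_b) * z)  # backscatter transmission T^B_ch(z)
    return np.asarray(J) * T_d + np.asarray(B_inf) * (1.0 - T_b)
```

At $z = 0$ the observed color equals the true scene color; as $z \to \infty$ it converges to the veiling light $B^\infty$, matching the intuition that distant underwater objects dissolve into the water color.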

2. Physically-Informed Channel-Decoupled Learning

OceanSplat enhances color and geometry reconstruction in highly attenuating, color-shifting environments by decoupling RGB channel learning. A dedicated multilayer perceptron (MLP) is trained to estimate, for each Gaussian and for each color channel, the attenuation and backscatter coefficients, as well as backscatter color. This channel-specific adaptation enables per-ray affine correction of Gaussian appearance:

$$c^m_{i,ch}(v) = T^D_{i,ch} \cdot c_{i,ch}(v) + \big(1 - T^B_{i,ch}\big) \cdot b_{i,ch},$$

where $c_{i,ch}(v)$ is the raw SH-based color and the transformation's coefficients are learned in a depth- and channel-aware manner (Jiang et al., 21 May 2025, Li et al., 13 Aug 2025). The result is robust recovery of natural hues and geometric consistency, even under severe water color casts and spectral selectivity.
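Given learned coefficients, the affine model is straightforward to invert, which is what makes color-restored outputs possible: subtract the estimated backscatter, then undo the attenuation. A hedged sketch under the Section 1 formation model (names and shapes are assumptions):

```python
import numpy as np

def restore_color(I, z, beta_d, beta_b, B_inf):
    """Invert the underwater formation model to recover the restored color:
    J_ch = (I_ch - B_inf_ch * (1 - exp(-beta_b_ch * z))) * exp(+beta_d_ch * z).

    I:      (3,) observed (in-water) color
    z:      scalar range/depth
    beta_d, beta_b, B_inf: (3,) learned per-channel medium parameters
    """
    backscatter = np.asarray(B_inf) * (1.0 - np.exp(-np.asarray(beta_b) * z))
    return (np.asarray(I) - backscatter) * np.exp(np.asarray(beta_d) * z)
```

Because attenuation is undone multiplicatively by $e^{+\beta^d z}$, restoration amplifies sensor noise at large ranges, which is one reason channel- and depth-aware regularization matters.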

Loss terms include:

  • channel-wise reconstruction loss $L_\text{rec}$,
  • depth supervision $L_\text{depth}$,
  • grey-world color-balance loss $L_g$.
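A sketch of how these terms might be combined into a single objective, assuming L1 reconstruction and depth terms and a squared-deviation grey-world penalty (the weights and exact formulations here are illustrative, not taken from the papers):

```python
import numpy as np

def grey_world_loss(img):
    """Grey-world color balance: penalize each channel mean's deviation
    from the global mean intensity (one common formulation)."""
    ch_means = img.reshape(-1, 3).mean(axis=0)
    return float(np.sum((ch_means - ch_means.mean()) ** 2))

def total_loss(rendered, target, depth_pred, depth_ref,
               w_rec=1.0, w_depth=0.1, w_g=0.01):
    """Weighted sum of reconstruction, depth-supervision, and grey-world
    terms; the weights are placeholders, not published values."""
    l_rec = np.mean(np.abs(rendered - target))         # L_rec (L1, per channel)
    l_depth = np.mean(np.abs(depth_pred - depth_ref))  # L_depth
    return w_rec * l_rec + w_depth * l_depth + w_g * grey_world_loss(rendered)
```

The grey-world term pulls the restored image toward balanced channel means, which counteracts the strong blue/green cast left by spectrally selective attenuation.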

Dual-path extensions such as those in DualPhys-GS decompose the formation into direct transmission (attenuation) and multi-scale, feature-augmented scattering (backscatter), each parameterized by channel-aware, depth-aware feature networks, enabling nuanced handling across clear and turbid water types (Li et al., 13 Aug 2025).

3. Sparse-View Regularization and Optimization Strategies

OceanSplat incorporates several innovations to improve sparse-view and degraded-input robustness:

  • Intermediate Frame Interpolation (IFI): A real-time optical-flow-based method (e.g., RIFE) inserts interpolated frames between input views, boosting initial Gaussian density (by roughly 55% on typical sequences) and improving geometric completeness (Jiang et al., 21 May 2025).
  • Adaptive Frame Weighting (AFW): A learnable uncertainty parameter $\gamma_f$ per frame dynamically reweights its contribution to the loss, down-weighting unreliable or artifact-prone interpolated frames via $L'_f = \frac{1}{2}\gamma_f L_f - \frac{1}{2}\alpha \log \gamma_f$.
  • Edge-aware Smoothness Loss (ESL): Enforces regularization that is strong in smooth (low depth-gradient) regions, but relaxed at high-gradient (edge) regions, thus suppressing noise without blurring object boundaries:

$$L_\text{smooth}(I, D) = \sum_{i,j} \Big[\, w^x_{i,j}\,|I_{i,j} - I_{i,j+1}| + w^y_{i,j}\,|I_{i,j} - I_{i+1,j}| \,\Big],$$

with weights $w^\ast_{i,j} = \exp(-\lambda_b\,|\nabla D|)$ (Jiang et al., 21 May 2025).

  • Trinocular View Consistency: Recent advances introduce trinocular consistency constraints by rendering synthetic horizontally and vertically translated views, enforcing inverse-warped photometric alignment. These synthetic baselines facilitate a self-supervised, synthetic epipolar depth prior for geometric regularization. Depth-aware alpha adjustment MLPs further suppress spurious “floating” Gaussians in non-object, medium-dominated regions (Kweon et al., 8 Jan 2026).

4. Integration of Physical Models, Losses, and Adaptive Controls

OceanSplat models explicitly respect the multi-faceted physical properties of underwater and oceanic light transport. This includes:

  • Attenuation–Scattering Consistency Loss: Penalizes violations of the expected anti-correlation of transmission and scattering with depth, and enforces $S(x) + T(x) \approx 1$ for energy conservation (Li et al., 13 Aug 2025).
  • Water-body-type Adaptive Loss: A learned classifier predicts clear, medium, or turbid regime and dynamically reweights loss terms and learning rates to optimize for the prevailing water conditions. Channel/path-specific LR schedules further tailor the optimization trajectory (Li et al., 13 Aug 2025).
  • Edge-aware and Multi-scale Losses: Ensure retention of structural details via per-pixel and per-scale supervision, enabling both global fidelity and local sharpness.
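The energy-conservation component of the consistency loss reduces to a simple per-pixel penalty (a sketch only; the papers' exact weighting and the additional anti-correlation terms are omitted):

```python
import numpy as np

def energy_conservation_loss(S, T):
    """Penalize deviation from S(x) + T(x) = 1 at every pixel, where S is
    the predicted scattering map and T the predicted transmission map."""
    return float(np.mean((np.asarray(S) + np.asarray(T) - 1.0) ** 2))
```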

Auxiliary losses such as backscatter-prior penalties, color balance, and depth-consistency terms regularly appear, often inherited from the base 3DGS or NeRF literature but adapted for spectral specificity and aquatic noise (Yang et al., 2024, Shi et al., 3 May 2025).

5. Evaluation, Benchmarking, and Datasets

OceanSplat methods report strong, quantifiable improvements over both volumetric and explicit-only baselines:

| Method | PSNR gain (dB) | SSIM Δ | LPIPS Δ | Runtime / FPS |
| --- | --- | --- | --- | --- |
| OceanSplat/RUSplat | +1.90 | +0.05 | −0.05 | ~real-time |
| DualPhys-GS | +0.8 (IUI3) | within 1% of SOTA | −0.03 | 0.016 s/frame |
| SeaSplat | +2.18 | +0.09 | −0.15 | 80 FPS |
| WaterSplatting | see Table 5 | 0.93 | 0.125 | 41.8 FPS |
| AquaGS | +0.43 | +0.04 | −0.036 | ~23–37 s/scene |

Qualitatively, OceanSplat reconstructions retain more fine structure, sharper textures, and more accurate geometric details, particularly in low SNR, highly turbid, or blue/green-cast environments. Metric gains are maintained across different benchmarks (Submerged3D, SeaThru-NeRF, BlueCoral3D, SaltPond), and the explicit decoupling of geometry and medium enables both in-medium rendering and color-restored outputs without retraining (Jiang et al., 21 May 2025, Yang et al., 2024, Li et al., 13 Aug 2025, Kweon et al., 8 Jan 2026).

6. Applications, Limitations, and Prospective Extensions

OceanSplat’s physics-driven, channel-specific, and noise-robust design yields immediate benefits for:

  • Autonomous underwater vehicle (AUV) navigation and seabed inspection
  • Archaeological mapping of submerged objects and shipwrecks
  • Habitat and coral reef reconstruction in marine biology
  • Long-term environmental monitoring and change detection

Limitations. OceanSplat architectures often assume homogeneous water properties per scene; spatially varying turbidity would require spatially resolved attenuation/scattering models (e.g., MLPs over $\beta(\lambda, x)$) (Jiang et al., 21 May 2025, Li et al., 13 Aug 2025). In dynamic scenes or where pseudo-depth is unreliable, floating artifacts or geometric inconsistencies can arise. Frame interpolation can introduce view-specific artifacts and relies on robust uncertainty weighting to suppress these. Memory usage remains substantial when modeling highly granular scenes.

Possible extensions include coupling with Neural Radiance Fields for hybrid representations, anisotropic scattering kernels, integration of multi-modal (e.g., sonar, inertial) inputs, and dynamic scene handling with temporal priors (Jiang et al., 21 May 2025, Shi et al., 3 May 2025, Li et al., 13 Aug 2025).

7. Other Contexts: OceanSplat in Physical Oceanography and Spray Modeling

The OceanSplat moniker also appears in the simulation of oceanic surface phenomena and sea-spray microphysics. In computational fluid dynamics, OceanSplat is associated with hybrid FLIP–BEM simulation (Fluid-Implicit Particle/Boundary Element Method) for capturing the detailed dynamics of ship-generated splashes alongside long-range deep-water waves, with adaptive remeshing, subcycling, and multi-solver coupling (Huang et al., 2021).

In physical oceanography, OceanSplat refers to models of sea-spray generation by wave splashing, built on microphysical rim-collision events in breaking waves. These models predict drop-size distributions and spray generation functions parameterized by wave spectra and rim-collision statistics, forming inputs to coupled ocean–atmosphere simulations (Tang et al., 2 Oct 2025). Limitations relate to the two-dimensionalization of real splashes and parameter uncertainties inherent in field conditions.


OceanSplat thus defines a class of physics-informed, channel-aware Gaussian-splatting pipelines for underwater and oceanic scene reconstruction, restoration, and simulation. Central advances include robust decoupling of attenuation and scattering, adaptive contextual optimization, sparse-view and edge-aware regularization, and rigorous benchmarking on domain-specific datasets (Jiang et al., 21 May 2025, Yang et al., 2024, Kweon et al., 8 Jan 2026, Li et al., 13 Aug 2025, Huang et al., 2021, Tang et al., 2 Oct 2025).
