
Textured Gaussian Primitives

Updated 3 December 2025
  • Textured Gaussian Primitives are explicit, differentiable rendering representations that extend classical Gaussian splatting by integrating per-primitive textures for detailed scene synthesis.
  • They employ adaptive sampling and parameter allocation to optimize texture resolution and preserve high-frequency details across complex visual regions.
  • They enable advanced applications in neural rendering, view synthesis, and volumetric visualization while achieving ultra-compact representations and real-time performance.

Textured Gaussian Primitives constitute an explicit, differentiable rendering representation that augments classical Gaussian splatting with spatially varying appearance maps—enabling high-fidelity, real-time scene synthesis, ultra-compact geometry-appearance decoupling, and principled parameter allocation. They generalize each Gaussian primitive, traditionally limited to a single color and opacity per splat, by embedding per-primitive textures or spatially varying functions (e.g., RGBA maps, neural fields, material attribute charts). This framework has become central in modern neural rendering, view synthesis, and volume visualization, with numerous variants tailored for content-adaptive, frequency-aware, and physically based modeling (Xie et al., 28 Nov 2025, Papantonakis et al., 2 Dec 2025, Chao et al., 27 Nov 2024, Song et al., 2 Dec 2024, Tang et al., 18 Jul 2025, Svitov et al., 13 Nov 2024, Xu et al., 28 Nov 2024, Rong et al., 19 Sep 2024, Younes et al., 16 Jun 2025).

1. Formal Definition and Mathematical Construction

A textured Gaussian primitive is an extension of the standard Gaussian “splat”—either 2D in image/world space or 3D in volume space—with added spatially varying appearance channels. Let $G_i$ denote a primitive with center $\mu_i$, covariance $\Sigma_i$, and base color/opacity. Textured variants append either:

  • Explicit texture map: $T_i : [0, U) \times [0, V) \to \mathbb{R}^C$ ($C=1$ for alpha, $C=3$ for RGB, $C=4$ for RGBA)
  • Spatially varying function: e.g., bilinear interpolation over four learned kernel points, movable kernel aggregation, or tiny local neural networks (Xu et al., 28 Nov 2024)

For a ray–primitive intersection at local coordinates $(u,v)$, the pixel-wise color and opacity become:

$$c_i(u,v) = c_i^{\text{base}} + T_i^{\text{RGB}}(u,v), \quad \alpha_i(u,v) = T_i^{\alpha}(u,v)\,G_i(u,v)\,o_i$$

where $T_i^{\text{RGB}}, T_i^{\alpha}$ are texture channels, sampled via bilinear or more advanced adaptive schemes.

Texture coordinates are defined via geometric mapping from intersection or projection onto the support plane, scaled so that texture resolution matches the scene content or rendering frequency (Papantonakis et al., 2 Dec 2025). UV-mapping may be set so texel size in world units is fixed, or adaptively warped based on content complexity.
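The per-primitive evaluation above can be sketched in NumPy. The function name, the RGBA chart layout, and the choice of centering the Gaussian at $(0.5, 0.5)$ in local coordinates are assumptions made for this sketch, not details taken from any of the cited implementations:

```python
import numpy as np

def eval_textured_gaussian(uv, texture, base_color, opacity, inv_cov):
    """Evaluate one textured Gaussian primitive at local coords (u, v) in [0, 1)^2.

    texture:    (H, W, 4) RGBA chart T_i
    base_color: (3,) base color c_i^base
    opacity:    scalar o_i
    inv_cov:    (2, 2) inverse covariance of the 2D Gaussian footprint
    """
    u, v = uv
    H, W, _ = texture.shape
    # Bilinear texture fetch at continuous texel coordinates.
    x, y = u * (W - 1), v * (H - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, W - 1), min(y0 + 1, H - 1)
    fx, fy = x - x0, y - y0
    texel = ((1 - fx) * (1 - fy) * texture[y0, x0] +
             fx * (1 - fy) * texture[y0, x1] +
             (1 - fx) * fy * texture[y1, x0] +
             fx * fy * texture[y1, x1])
    # Gaussian falloff G_i, centred at (0.5, 0.5) in local coordinates.
    d = np.array([u - 0.5, v - 0.5])
    g = np.exp(-0.5 * d @ inv_cov @ d)
    # c_i(u,v) = c_base + T_RGB(u,v);  alpha_i(u,v) = T_alpha(u,v) * G_i * o_i
    color = base_color + texel[:3]
    alpha = texel[3] * g * opacity
    return color, alpha
```

A production rasterizer would vectorize this over all pixels in a primitive's footprint; the scalar version only makes the formula concrete.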

2. Parameterization Strategies and Adaptive Sampling

Uniform texture grids allocate sampling density equally throughout the Gaussian's support, which is inefficient in the presence of heterogeneous visual complexity. FACT-GS introduced a differentiable, frequency-aligned sampling scheme: a local warp $\phi$ reallocates texel density according to image-space gradients (Xie et al., 28 Nov 2025):

$$\phi(u) = u + \lambda D(u)$$

where $D(u)$ is a learnable offset field (MLP or convolutional grid), and the local sampling density is modulated by the Jacobian determinant $|\det J_\phi(u)|$. The optimal density, per sampling theory, aligns as

$$\rho^*(u) \propto \bigl(\|\nabla C(u)\| + \epsilon\bigr)^{\alpha}$$

with $\alpha > 0$ and small $\epsilon > 0$, thus concentrating the texel budget where high-frequency detail is present.
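The target density can be computed directly from image gradients. This is a minimal sketch of the density rule only (not of the learned warp field itself), using finite-difference gradients as a stand-in for the estimated local frequency:

```python
import numpy as np

def frequency_aligned_density(image, alpha=1.0, eps=1e-3):
    """Sampling density rho*(u) ∝ (||∇C(u)|| + eps)^alpha, normalized to sum to 1.

    image: (H, W) grayscale view of the content C.
    """
    # Finite-difference gradients along rows (axis 0) and columns (axis 1).
    gy, gx = np.gradient(image.astype(np.float64))
    grad_mag = np.hypot(gx, gy)
    # Power-law emphasis of high-gradient regions; eps keeps flat regions nonzero.
    rho = (grad_mag + eps) ** alpha
    return rho / rho.sum()
```

Regions with strong edges receive a proportionally larger share of the texel budget, while flat regions retain only the $\epsilon$-floor density.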

Content-aware texturing schemes employ adaptive upscaling/downscaling and primitive splitting: the per-primitive texel-to-pixel ratio $t_{2p,r,i}$ is monitored, coarse textures are low-pass filtered and merged when possible, and high-error regions are split or upscaled (Papantonakis et al., 2 Dec 2025). This prevents wasted parameters in smooth regions and under-sampling in complex ones.
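One plausible form of this feedback rule is sketched below; the threshold values and the exact decision criteria are illustrative assumptions, not the precise tests used by Papantonakis et al.:

```python
def adapt_texture_resolution(t2p_ratio, recon_error, err_thresh=0.05,
                             oversampled=2.0, undersampled=0.5):
    """Content-aware resolution control for one primitive's texture.

    t2p_ratio:   current texel-to-pixel ratio for the primitive
    recon_error: mean photometric error over the pixels it covers
    Returns one of 'downscale', 'upscale', 'split', 'keep'.
    All thresholds are illustrative, not taken from the paper.
    """
    if t2p_ratio > oversampled and recon_error <= err_thresh:
        return 'downscale'  # more texels than pixels: low-pass filter and shrink
    if t2p_ratio < undersampled and recon_error > err_thresh:
        return 'split'      # too few texels and high error: split the primitive
    if recon_error > err_thresh:
        return 'upscale'    # resolution roughly right but detail still missing
    return 'keep'
```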

3. Rendering Pipeline and Compositing

Rendering with textured Gaussian primitives involves several stages:

  • Intersection: For each pixel or camera ray, compute the intersection parameters $(u,v)$ on the Gaussian support.
  • Texture Sampling: Sample per-primitive texture maps (RGBA/material charts/neural fields) at the intersection point, using bilinear filtering, adaptive warping, or neural evaluation.
  • Color/Opacity Calculation: Combine low-frequency spherical harmonics or base color with sampled offset for spatially varying radiance.
  • Compositing: Primitives are sorted front-to-back, each contributes via

$$C(p) = \sum_{i=1}^{K} c_i(p)\,\alpha_i(p) \prod_{j<i} \bigl(1 - \alpha_j(p)\bigr)$$

This preserves the differentiable, explicit blending inherited from Gaussian splatting.
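The compositing sum translates directly into code. A per-pixel sketch (real rasterizers vectorize this over tiles; the early-termination threshold is a common convention, not a fixed constant from the papers):

```python
import numpy as np

def composite_front_to_back(colors, alphas):
    """C(p) = sum_i c_i(p) * alpha_i(p) * prod_{j<i} (1 - alpha_j(p)).

    colors: (K, 3) per-primitive colors at pixel p, sorted front to back.
    alphas: (K,)  per-primitive opacities at pixel p.
    """
    out = np.zeros(3)
    transmittance = 1.0  # prod_{j<i} (1 - alpha_j), accumulated incrementally
    for c, a in zip(colors, alphas):
        out += transmittance * a * np.asarray(c, dtype=float)
        transmittance *= (1.0 - a)
        if transmittance < 1e-4:  # early termination once the pixel saturates
            break
    return out
```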

Atlas-packing and hardware sampling (e.g., via CUDA texture objects) enable efficient aggregation of many small primitive charts into a single memory-efficient block (Younes et al., 16 Jun 2025). Per-ray sorting, depth buffer handling, and frustum-based anti-aliasing (e.g. HDGS (Song et al., 2 Dec 2024)) further stabilize view-consistency and detail preservation.

4. Optimization Objectives and Loss Functions

Training involves joint optimization of geometry ($\mu_i, \Sigma_i$), appearance (texture, SH, material properties), and sampling warps via photometric, perceptual, and regularization losses. A common objective takes the form

$$L = L_{\text{recon}} + \alpha L_{\text{freq}} + \beta L_{\text{smooth}} + \gamma L_{\text{texture sparsity}} + \dots$$

where:

  • $L_{\text{recon}}$: photometric error (e.g., $\mathcal{L}_1$, SSIM) between rendered and ground-truth images.
  • $L_{\text{freq}}$: penalty for mismatch between allocated texel density and estimated local frequency (Xie et al., 28 Nov 2025).
  • $L_{\text{smooth}}$: regularization on deformation fields to prevent folding or excessive warp.
  • Texture sparsity encourages compact representation (Svitov et al., 13 Nov 2024), and opacity regularization eliminates unused primitives.
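A toy composite of these terms can be written in a few lines. The specific penalty forms and weights below are assumptions chosen to mirror the structure of the objective, not the exact losses of any cited method:

```python
import numpy as np

def total_loss(rendered, target, density, target_density, warp_jac, texture,
               w_freq=0.1, w_smooth=0.01, w_sparse=1e-4):
    """Illustrative L = L_recon + a*L_freq + b*L_smooth + g*L_sparsity.

    rendered/target:         (H, W) or (H, W, 3) images
    density/target_density:  (H, W) allocated vs. frequency-derived texel density
    warp_jac:                (H, W) Jacobian determinants of the sampling warp
    texture:                 per-primitive texture values
    """
    l_recon = np.abs(rendered - target).mean()        # L1 photometric term
    l_freq = np.abs(density - target_density).mean()  # density/frequency mismatch
    l_smooth = ((warp_jac - 1.0) ** 2).mean()         # keep warp near identity
    l_sparse = np.abs(texture).mean()                 # texture sparsity
    return l_recon + w_freq * l_freq + w_smooth * l_smooth + w_sparse * l_sparse
```

In practice these losses are computed with an autodiff framework so gradients flow back to the geometry, textures, and warp fields jointly.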

Adaptive optimization loops monitor error maps and perform upscaling/downscaling or geometric refinement, maintaining a closed feedback cycle for parameter allocation (Papantonakis et al., 2 Dec 2025).

5. Practical Implementations and Applications

Textured Gaussian primitives have been deployed across real-time scene rendering, view synthesis (including novel viewpoint extrapolation), volume visualization, expressive style transfer, and mesh extraction. Key approaches:

  • GStex: Per-primitive textures allow decoupling geometry and appearance, leading to compact, editable scene representations; sharp textures are preserved even with drastic reductions in primitive count (Rong et al., 19 Sep 2024).
  • SuperGaussians: Use compact spatially-varying functions (movable kernels/MLPs) for per-splat appearance, attaining +1.2dB PSNR improvement with only ~1.4x parameter overhead (Xu et al., 28 Nov 2024).
  • BillBoard Splatting: Planar primitives with explicit RGBA textures markedly compress storage and enable mesh export for full ray-tracing and rasterization effects (Svitov et al., 13 Nov 2024).
  • Fact-GS, HDGS: Adaptive sampling/warp, per-ray sorting, anti-aliasing sampling, high-frequency detail preservation (Xie et al., 28 Nov 2025, Song et al., 2 Dec 2024).
  • TexGS-VolVis: Adds explicit material, shading, and style attributes, permitting image/text-driven scene editing and partial region compositing in volume visualization (Tang et al., 18 Jul 2025).
  • TextureSplat: Embeds per-primitive material/normal charts, physically based deferred shading, and hardware-accelerated texture atlas fetch for reflective scenes (Younes et al., 16 Jun 2025).

6. Quantitative and Qualitative Performance Analysis

Textured Gaussian primitives consistently demonstrate superior performance over classical splatting approaches, both under fixed parameter budgets and in parameter-constrained regimes. Representative metrics (from (Xie et al., 28 Nov 2025, Papantonakis et al., 2 Dec 2025)):

| Method | Full-Budget PSNR (dB) | 10% Budget PSNR (dB) | #Primitives / Storage Reduction | LPIPS (↓) |
|---|---|---|---|---|
| 2DGS | 33.91 | 30.88 | baseline | 0.0235 |
| FACT-GS | 34.02 | 31.51 | same PSNR with 6× fewer params | 0.0220 |
| Content-Aware | +0.3–0.6 over alternatives | — | 60–80% | — |
| BBSplat | +2–4 dB over 3DGS | ~30% compression | up to ×17 storage reduction | — |

High-frequency regions maintain sharp edges under extreme parameter-budget reductions. Content-aware adaptive schemes route the budget where it is needed, sustaining perceptual and measured quality (SSIM, LPIPS, Chamfer error). Alpha-only textures rival RGBA maps in some regimes, though Alpha+RGB is optimal (Chao et al., 27 Nov 2024). Advanced spatial variation (e.g., SuperGaussians–MK) delivers further gains without excessive parameter overhead (Xu et al., 28 Nov 2024).

7. Future Developments and Limitations

Current constraints include storage overhead (particularly at high resolution or large primitive counts), minification artifacts in the absence of mipmapping, reliance on accurate geometric initialization, and limited physically based appearance encoding. Addressing these limitations constitutes the principal open avenues, driving ongoing methodology development in neural rendering and scene appearance modeling.


Textured Gaussian primitives thus represent a converged intersection of explicit differentiable rendering, adaptive sampling theory, and deep neural field modeling—yielding a highly expressive, efficient, and editable paradigm for photorealistic scene synthesis and visualization.
