- The paper introduces a two-pass, gradient-based sensitivity filter that suppresses rendering artifacts in 3D Gaussian Splatting for views with out-of-distribution camera poses.
- It computes per-ray sensitivity scores using a rotation-aligned gradient formulation to identify and suppress unstable primitives during rendering.
- The method achieves superior perceptual quality, with lower no-reference image quality scores (NIQE, BRISQUE, PIQE), enabling efficient real-time synthesis in challenging, unconstrained environments.
Sensitivity-Based Filtering for Robust 3D Gaussian Splatting under Out-of-Distribution Camera Poses
Introduction
This paper addresses a critical limitation in 3D Gaussian Splatting (3DGS) for novel view synthesis: the emergence of severe rendering artifacts when camera poses deviate significantly from the training distribution. Such artifacts (floating primitives, inconsistent geometry, and view-dependent noise) arise from insufficient multi-view constraints during training, which leave some Gaussian primitives poorly constrained. Existing solutions, such as retraining with additional regularization or post-hoc uncertainty quantification, are computationally expensive and impractical for real-time deployment. The authors propose a render-time filtering method that leverages gradient-based sensitivity analysis to identify and suppress unstable 3D primitives, particularly those exhibiting anisotropic, orientation-induced instabilities.
Technical Contributions
The core technical innovation is a two-pass, gradient-based sensitivity filter that operates at render time. The method computes per-ray, per-Gaussian sensitivity scores by analyzing the gradient of the rendered pixel color with respect to spatial perturbations of the Gaussian primitive. This analysis is performed in a rotation-aligned coordinate system, isolating directional instabilities due to anisotropic orientations. The filter consists of:
- Gradient Sensitivity Computation: For each ray-Gaussian intersection, the method calculates the gradient of the composite color with respect to the 3D position, focusing on transmittance dynamics while decoupling the score from color-specific variation. The gradient is evaluated in the Gaussian's rotation-aligned space, emphasizing orientation effects.
- Two-Pass Filtering: In the first pass, intersections are accepted or rejected based on a sensitivity threshold. In the second pass, Gaussians with a high rejection ratio (aggregate sensitivity) are excluded from rendering for the current viewpoint.
- Ray-Marching Integration: The filter is implemented within a ray-marching pipeline, enabling precise control over ray-Gaussian interactions and facilitating efficient, render-time sensitivity analysis.
This approach directly targets the core source of generative uncertainty—directional instability from anisotropic Gaussian primitives—without requiring retraining or modification of the underlying 3DGS reconstruction pipeline.
Methodological Details
3DGS Representation and Rendering
3DGS models radiance fields as a set of explicit 3D Gaussian primitives, each parameterized by color, opacity, mean, and covariance. The covariance is decomposed into scale and rotation matrices, representing the Gaussian as an ellipsoid. Rendering involves depth-sorting and alpha compositing of projected Gaussians onto the image plane.
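To make the representation concrete, the short sketch below assembles one Gaussian's covariance from its rotation and scale factors and computes front-to-back alpha-compositing weights along a ray; the function and variable names are illustrative assumptions, not the paper's code.

```python
import torch

def covariance_from_scale_rotation(R: torch.Tensor, scales: torch.Tensor) -> torch.Tensor:
    """Covariance of one Gaussian ellipsoid: Sigma = R S S^T R^T,
    with R a 3x3 rotation matrix and `scales` the three per-axis scales."""
    S = torch.diag(scales)
    return R @ S @ S.T @ R.T

def composite_weights(alphas: torch.Tensor) -> torch.Tensor:
    """Alpha-compositing weights w_k = alpha_k * prod_{j<k}(1 - alpha_j)
    for per-Gaussian opacities sorted front to back along a ray."""
    transmittance = torch.cumprod(
        torch.cat([torch.ones(1), 1.0 - alphas[:-1]]), dim=0)
    return alphas * transmittance
```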
Sensitivity Analysis
The sensitivity score for each Gaussian is derived from the gradient of the composite color with respect to spatial perturbations, expressed as:
$$\nabla C(\mathbf{x}) \;=\; \sum_{k=1}^{K} c_k\,\alpha_k \prod_{j=1}^{k-1}(1-\alpha_j)\left(\sum_{j=1}^{k-1}\frac{\alpha_j}{1-\alpha_j}\,\Sigma_j^{-1}\mathbf{x}_j \;-\; \Sigma_k^{-1}\mathbf{x}_k\right)$$
To focus on structural sensitivity, the color vector is replaced with a scalar, and the gradient is computed in the rotation-aligned space:
$$S \;=\; \sum_{k=1}^{K} \alpha_k \prod_{j=1}^{k-1}(1-\alpha_j)\left(\sum_{j=1}^{k-1}\frac{\alpha_j}{1-\alpha_j}\,\mathbf{x}_j \;-\; \mathbf{x}_k\right)$$
This formulation isolates rotational sensitivity, highlighting Gaussians prone to view-dependent artifacts due to poorly constrained orientation.
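A minimal sketch of one plausible reading of the score above, treating each summand as the per-intersection sensitivity: `alphas` are the opacity contributions α_k along a ray (front to back) and `offsets` the displacements x_k from each Gaussian's mean, already expressed in its rotation-aligned frame. The names and the clamping constant are assumptions for illustration.

```python
import torch

def ray_sensitivity(alphas: torch.Tensor, offsets: torch.Tensor) -> torch.Tensor:
    """Per-intersection sensitivity scores along one ray.

    alphas:  (K,)   opacity contribution alpha_k of each intersection, sorted front to back.
    offsets: (K, 3) displacement x_k of the sample from the k-th Gaussian's mean,
                    expressed in that Gaussian's rotation-aligned frame.
    Returns: (K,)   sensitivity of each intersection: the norm of the bracketed
                    gradient term weighted by alpha_k * prod_{j<k}(1 - alpha_j).
    """
    transmittance = torch.cumprod(
        torch.cat([torch.ones(1), 1.0 - alphas[:-1]]), dim=0)        # prod_{j<k}(1 - alpha_j)
    weights = alphas * transmittance                                  # compositing weight of k
    # Running sum over j < k of alpha_j / (1 - alpha_j) * x_j (exclusive prefix sum).
    ratio = (alphas / (1.0 - alphas).clamp_min(1e-6)).unsqueeze(-1) * offsets
    prefix = torch.cumsum(ratio, dim=0) - ratio
    grad_term = prefix - offsets                                      # sum_{j<k}(...) - x_k
    return weights * grad_term.norm(dim=-1)
```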
Filtering Pipeline
- First Pass: For each ray-Gaussian intersection, compute sensitivity and accept/reject based on a threshold.
- Second Pass: For each Gaussian, compute the rejection ratio across all of its intersections; exclude Gaussians whose ratio exceeds a user-defined threshold from rendering of the current view (see the sketch after this list).
This pipeline preserves detail-carrying, stable Gaussians while suppressing unstable, artifact-inducing primitives.
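A compact sketch of the two-pass logic, assuming the per-intersection scores above have already been gathered for the current frame; `tau_s`, `tau_r`, and the bookkeeping arrays are hypothetical names rather than the authors' implementation.

```python
import torch

def two_pass_filter(scores: torch.Tensor, gaussian_ids: torch.Tensor,
                    num_gaussians: int, tau_s: float, tau_r: float) -> torch.Tensor:
    """Return a boolean keep-mask over Gaussians for the current viewpoint.

    scores:       (N,) sensitivity of every ray-Gaussian intersection in the frame.
    gaussian_ids: (N,) long tensor giving the Gaussian index of each intersection.
    tau_s:        per-intersection sensitivity threshold (first pass).
    tau_r:        per-Gaussian rejection-ratio threshold (second pass).
    """
    # First pass: reject intersections whose sensitivity exceeds tau_s.
    rejected = (scores > tau_s).float()

    # Second pass: per Gaussian, the fraction of its intersections that were rejected;
    # Gaussians whose rejection ratio exceeds tau_r are excluded for this view.
    hits = torch.zeros(num_gaussians).index_add_(0, gaussian_ids, torch.ones_like(rejected))
    rejects = torch.zeros(num_gaussians).index_add_(0, gaussian_ids, rejected)
    rejection_ratio = rejects / hits.clamp_min(1.0)
    return rejection_ratio <= tau_r   # True = keep the Gaussian when compositing this view
```

Gaussians masked out here are simply skipped during compositing for the current view; the underlying reconstruction is left untouched.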
Experimental Results
The method is evaluated on Deep Blending and NeRF On-the-go datasets, using a modified Nerfstudio Splatfacto pipeline with ray-marching and the proposed filter. Perceptual quality is assessed using NR-IQA metrics (NIQE, BRISQUE, PIQE), with lower scores indicating better quality. Across all scenes and metrics, the proposed filter achieves the lowest scores, outperforming BayesRays—a NeRF-based uncertainty filtering baseline. The filter effectively suppresses anisotropy-induced artifacts in extreme OOD views without over-smoothing, maintaining high visual fidelity and geometric consistency.
Ablation Studies
- Single-Pass Filtering: Filtering only at the intersection level fails to remove artifact-prone Gaussians where rays pass near their centers, since gradient magnitudes are low there.
- Scale-Incorporated Gradients: Including scale in the gradient calculation normalizes anisotropic Gaussians, leading to loss of detail and over-filtering of small, detail-critical primitives.
- Parameter Sensitivity: Optimal performance requires scene-specific tuning of sensitivity and rejection thresholds; using only one parameter yields suboptimal perceptual quality.
Theoretical and Practical Implications
The proposed sensitivity filter provides a fast, render-time proxy for epistemic uncertainty, analogous to the Fisher Information Matrix but computed per-intersection in the rotation-aligned space. This enables real-time suppression of artifacts without retraining, making the approach suitable for interactive applications and deployment scenarios with unpredictable camera trajectories. The method bridges a gap in uncertainty quantification for explicit volumetric representations, addressing directional instabilities that are inadequately handled by isotropic or redundancy-based filters.
Future Directions
Potential future developments include:
- Adaptive Thresholding: Automated, data-driven selection of sensitivity and rejection thresholds to optimize perceptual quality across diverse scenes.
- Integration with Active View Selection: Leveraging sensitivity scores for dynamic view planning and data acquisition in 3D reconstruction.
- Extension to Other Explicit Representations: Adapting the filter to other explicit volumetric or point-based rendering techniques.
- Hardware Acceleration: Optimizing the filter for GPU-based real-time rendering pipelines.
Conclusion
This work introduces a real-time, gradient-based sensitivity filter for 3DGS that robustly suppresses anisotropic, orientation-induced artifacts in novel view synthesis from out-of-distribution camera poses. The method operates entirely at render time, requiring no retraining or modification of the reconstruction pipeline, and achieves superior perceptual quality compared to existing NeRF-based uncertainty filters. The approach advances the practical deployment of 3DGS in interactive and unconstrained viewing scenarios, with implications for uncertainty quantification and robust scene reconstruction in computer vision and graphics.