Total Normal Curvature Regularization

Updated 29 December 2025
  • Total normal curvature regularization is defined by integrating directional normal curvatures to preserve sharp geometric features like edges and corners.
  • It employs variational formulations and operator-splitting schemes that successfully smooth surfaces and images while maintaining key structures.
  • Applications range from triangulated surface smoothing and graph-based denoising to curvature penalties in deep generative models and neural network training.

Total normal curvature regularization is a class of variational methods that penalize the integrated absolute normal curvature over a domain, surface, or data manifold. It generalizes scalar curvature-based regularizers and provides a directionally isotropic means of promoting geometric fidelity—preserving sharp features such as edges and corners—across applications ranging from geometric inverse problems and surface processing to deep generative modeling and neural networks. The regularizer is typically implemented either via direct integration of pointwise normal curvatures from multiple directions or as an aggregate of higher-order differential operators that encode extrinsic, intrinsic, or data-induced curvature.

1. Mathematical Definition and Geometric Foundations

Total normal curvature (TNC) regularization is defined by integrating a local measure of normal curvature over all tangent directions at each point in the domain. For a graph surface $v:\Omega\subset\mathbb{R}^2\to\mathbb{R}$, the directional normal curvature at $(x,y)$ in direction $\theta$ is

$$\kappa_n(x,y;\theta) = \frac{\mathbf{t}^T\mathbf{H}(v)\,\mathbf{t}}{\sqrt{1+|\nabla v|^2}\,\bigl[1+(\nabla v\cdot\mathbf{t})^2\bigr]},$$

where $\mathbf{t}(\theta) = (\cos\theta,\sin\theta)^T$ and $\mathbf{H}(v)$ is the Hessian of $v$. The total normal curvature at $(x,y)$ is then

$$\kappa_T(x,y) = \int_0^{2\pi} |\kappa_n(x,y;\theta)|\, d\theta.$$

A typical variational model for denoising or surface smoothing is

$$E(v) = \frac{\alpha}{2} \int_\Omega\int_0^{2\pi} \frac{\left|\mathbf{t}^T \mathbf{H}(v)\, \mathbf{t}\right|}{1 + (\nabla v \cdot \mathbf{t})^2}\, d\theta\, dx + \beta\int_\Omega |\nabla v|\, dx + \frac{\gamma}{2}\int_\Omega |f-v|^2\, dx,$$

where $\alpha,\beta,\gamma$ are weights for the TNC, TV, and fidelity terms, respectively (Lu et al., 22 Dec 2025).
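As a concrete illustration, the following NumPy sketch (not code from the cited papers) evaluates the directional normal curvatures and the discrete energy on a regular grid; the function names, grid spacing `h`, and the default $N=8$ quadrature are assumptions for illustration.

```python
import numpy as np

def total_normal_curvature(v, h=1.0, n_dirs=8, slope_factor=True):
    """Approximate kappa_T(x, y) by an n_dirs-point quadrature over theta."""
    vy, vx = np.gradient(v, h)            # axis 0 ~ y, axis 1 ~ x
    vxy, vxx = np.gradient(vx, h)
    vyy, vyx = np.gradient(vy, h)
    vxy = 0.5 * (vxy + vyx)               # symmetrize the mixed derivative
    grad2 = vx**2 + vy**2

    thetas = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
    kappa_T = np.zeros_like(v)
    for th in thetas:
        tx, ty = np.cos(th), np.sin(th)
        num = tx * tx * vxx + 2.0 * tx * ty * vxy + ty * ty * vyy   # t^T H(v) t
        den = 1.0 + (vx * tx + vy * ty) ** 2
        if slope_factor:                  # full normal-curvature denominator
            den = den * np.sqrt(1.0 + grad2)
        kappa_T += np.abs(num / den)
    return kappa_T * (2.0 * np.pi / n_dirs)   # quadrature weight in theta

def energy(v, f, alpha, beta, gamma, h=1.0, n_dirs=8):
    """Discrete analogue of E(v): TNC term (without the area factor, as in the
    model above) + TV term + L2 fidelity."""
    vy, vx = np.gradient(v, h)
    tnc = total_normal_curvature(v, h, n_dirs, slope_factor=False).sum() * h * h
    tv = np.sqrt(vx**2 + vy**2).sum() * h * h
    fid = ((f - v) ** 2).sum() * h * h
    return 0.5 * alpha * tnc + beta * tv + 0.5 * gamma * fid
```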

For triangulated surfaces $\Gamma_h$ in $\mathbb{R}^3$, the discrete total normal variation is

$$TV_h(n) = \sum_{E\in \mathcal{E}} |E|\, \arccos(n_{1,E} \cdot n_{2,E}),$$

where $|E|$ is the edge length and $n_{1,E}, n_{2,E}$ are the unit normals of the two faces adjacent to $E$. This exactly recovers the classical discrete total mean curvature (Bergmann et al., 2019).
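A minimal NumPy sketch of this discrete functional, assuming a mesh given as vertex and face index arrays (helper names are illustrative, not the authors' code):

```python
import numpy as np

def face_normals(verts, faces):
    """Unit normals of each triangular face."""
    a, b, c = verts[faces[:, 0]], verts[faces[:, 1]], verts[faces[:, 2]]
    n = np.cross(b - a, c - a)
    return n / np.linalg.norm(n, axis=1, keepdims=True)

def total_normal_variation(verts, faces):
    """Sum over interior edges of |E| * angle between adjacent face normals."""
    normals = face_normals(verts, faces)
    edge_faces = {}                               # undirected edge -> incident faces
    for fi, (i, j, k) in enumerate(faces):
        for e in ((i, j), (j, k), (k, i)):
            edge_faces.setdefault(tuple(sorted(e)), []).append(fi)
    tv = 0.0
    for (i, j), incident in edge_faces.items():
        if len(incident) != 2:                    # skip boundary edges
            continue
        n1, n2 = normals[incident[0]], normals[incident[1]]
        length = np.linalg.norm(verts[i] - verts[j])
        cosang = np.clip(np.dot(n1, n2), -1.0, 1.0)
        tv += length * np.arccos(cosang)
    return tv
```

On a triangulated unit cube, for instance, the face diagonals contribute zero (adjacent normals coincide) while each of the twelve cube edges contributes $\pi/2$, so the sum is $6\pi$.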

In generative modeling, the extrinsic curvature regularizer is constructed in terms of the generalized Gauss map $T(z)$ and its differential, leading to a coordinate-invariant curvature density integrated over the sampling distribution in latent space (Lee et al., 2023).

2. Variational and Algorithmic Formulations

TNC regularization leads to variational PDEs of higher order. The constrained minimization in (Lu et al., 22 Dec 2025) is reformulated as a gradient-flow system driven to steady state:

$$\begin{cases} \eta\,\dfrac{\partial \mathbf{p}}{\partial t} + D_{\mathbf{q}}J_1(\mathbf{p},\mathbf{H}) + \partial_{\mathbf{q}}J_2(\mathbf{p}) + D_{\mathbf{q}}J_3(\mathbf{p}) + \cdots \ni 0, \\ \dfrac{\partial \mathbf{H}}{\partial t} + \partial_{\mathbf{G}}J_1(\mathbf{p},\mathbf{H}) + \cdots \ni 0, \end{cases}$$

where the functionals $J_i$ and the constraint indicators are as defined in (Lu et al., 22 Dec 2025).

Time integration is conducted via operator splitting:

  • Step 1: TNC subproblem (fixed-point/ADMM iterations for $\mathbf{p}, \mathbf{H}$)
  • Step 2: TV shrinkage
  • Step 3: Enforce $\mathbf{H} = \nabla \mathbf{p}$
  • Step 4: Data fidelity projection

These steps are efficiently solvable using closed-form updates, fixed-point iterations, ADMM cycles, and FFT-based Poisson solvers for each subproblem. Directional integration is implemented with $N$-angle quadrature, typically $N=8$ (Lu et al., 22 Dec 2025).
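As one representative ingredient of such splitting schemes, the sketch below shows an FFT-based periodic Poisson solve, here used to recover a scalar field whose gradient approximately matches a prescribed vector field. The periodic boundary handling and function names are assumptions for illustration, not details of the cited method.

```python
import numpy as np

def poisson_fft(rhs, h=1.0):
    """Solve laplacian(u) = rhs on a periodic grid via the FFT; mean of u set to 0."""
    ny, nx = rhs.shape
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, d=h)
    ky = 2.0 * np.pi * np.fft.fftfreq(ny, d=h)
    KX, KY = np.meshgrid(kx, ky)
    denom = -(KX**2 + KY**2)
    denom[0, 0] = 1.0                       # avoid division by zero for the mean mode
    u_hat = np.fft.fft2(rhs) / denom
    u_hat[0, 0] = 0.0                       # fix the free additive constant
    return np.real(np.fft.ifft2(u_hat))

def recover_from_gradient(p1, p2, h=1.0):
    """Reconstruct u from a target gradient field (p1, p2) by solving div(p) = laplacian(u)."""
    div = np.gradient(p1, h, axis=1) + np.gradient(p2, h, axis=0)
    return poisson_fft(div, h)
```

Within an operator-splitting loop, a solve of this kind would typically appear in the step that enforces consistency between the field and its prescribed gradient or Hessian.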

For triangulated surfaces, a Riemannian split Bregman (manifold ADMM) scheme alternates between updating vertex positions, jump vectors (via vectorial shrinkage), and Lagrange multipliers, using log maps and parallel transport on $S^2$ (Bergmann et al., 2019). In deep learning, the regularizer is implemented as a penalty on input derivatives up to order $N$, with the required directional derivatives estimated via backpropagation (Poschl, 3 Nov 2025).
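The sketch below illustrates the curvature-rate idea on a toy function: higher-order directional derivatives along a random direction are approximated by central finite differences, and the slope of $\log|D^n f|$ versus $n$ is fitted. The cited work estimates these derivatives via backpropagation; the finite-difference estimator and helper names here are assumptions.

```python
import numpy as np
from math import comb

def directional_derivative(f, x, d, order, eps=1e-2):
    """Central finite-difference estimate of the order-th derivative of f along d at x."""
    ks = np.arange(order + 1)
    coeffs = np.array([(-1.0) ** k * comb(order, k) for k in ks])
    offsets = (order / 2.0 - ks) * eps
    vals = np.array([f(x + o * d) for o in offsets])
    return float(coeffs @ vals) / eps**order

def curvature_rate(f, x, max_order=5, seed=0):
    """Least-squares slope of log|D^n f| versus n for n = 2..max_order."""
    rng = np.random.default_rng(seed)
    d = rng.normal(size=x.shape)
    d /= np.linalg.norm(d)
    orders = np.arange(2, max_order + 1)
    logs = np.array([np.log(abs(directional_derivative(f, x, d, n)) + 1e-12)
                     for n in orders])
    slope, _ = np.polyfit(orders, logs, 1)
    return slope

# toy check: for f(x) = exp(a . x), |D^n f| along d grows like |a . d|^n,
# so the fitted slope should be close to log|a . d|
a = np.array([3.0, -1.0, 2.0, 0.5])
f = lambda x: np.exp(a @ x)
print(curvature_rate(f, np.zeros(4)))
```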

3. Discrete, Graph, and Manifold Implementations

TNC models admit several forms of discretization depending on the data representation:

  • Triangulated surfaces: Edgewise sums using geodesic distances between normals precisely recover classical discrete total mean curvature. The model is scaling-invariant and insensitive to the presence of flat facets, favoring piecewise-flat reconstructions (Bergmann et al., 2019).
  • Image domains: Discrete sampling of directional curvatures with up to $N$ directions gives isotropic, edge-aware smoothing.
  • Graph-based data: In the CURE model, the penalty is the squared $L^2$-norm of a graph Laplacian, mimicking $(\Delta_M u)^2$ over a finite point cloud and enforcing smoothness beyond mere manifold-dimension constraints (Dong et al., 2019); see the sketch after this list.
  • Neural networks: The curvature rate $\lambda$ is estimated by fitting the exponential growth rate of higher-order derivative norms, $\log\|D^n f\|$ versus $n$, and the corresponding CRR loss penalizes the derivatives of small order $n$ via efficient directional estimates (Poschl, 3 Nov 2025).
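A minimal sketch of a graph-Laplacian curvature penalty in this spirit, assuming a symmetrized $k$-NN graph with Gaussian edge weights (the exact graph construction of the cited work may differ):

```python
import numpy as np

def knn_laplacian(points, k=6, sigma=1.0):
    """Unnormalized graph Laplacian L = D - W of a symmetrized k-NN graph."""
    n = len(points)
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]          # skip the point itself
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2.0 * sigma**2))
    W = np.maximum(W, W.T)                          # symmetrize
    return np.diag(W.sum(1)) - W

def curvature_penalty(u, L):
    """Discrete stand-in for the squared manifold Laplacian of u, ||L u||^2."""
    Lu = L @ u
    return float(Lu @ Lu)
```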

The following table summarizes primary discretization strategies:

| Setting | Discrete TNC functional | Key features |
| --- | --- | --- |
| Triangulated surface | $\sum_E \lvert E\rvert \arccos(n_{1,E}\cdot n_{2,E})$ | Exact discrete mean curvature (Bergmann et al., 2019) |
| Images, graphs | $N$-direction sum/integral of $\lvert\kappa_n(x,y;\theta)\rvert$ | Multi-direction, isotropic (Lu et al., 22 Dec 2025) |
| Point clouds | $\lVert\nabla_P u\rVert_2^2 + \lambda \lVert GL\,u\rVert_2^2$ | Graph Laplacian, higher-order smoothness (Dong et al., 2019) |
| Neural networks | $\sum_{n=2}^N \mathbb{E}_x\bigl[\lVert D^n f(x)\rVert^2\bigr]$ | Input-space higher derivatives (Poschl, 3 Nov 2025) |

4. Comparative Properties and Relation to Other Curvature Models

TNC regularization is distinct from and complementary to scalar curvature-based models, including Euler’s elastica (squared curvature penalty), mean curvature flow, and Gaussian curvature TV:

  • Edge and corner preservation: Integrating over directions increases isotropy and enables preservation of sharp geometric features. Unlike area-based or mean curvature squared priors, TNC regularization does not penalize flat faces or sharp corners, producing reconstructions with well-defined edges (Bergmann et al., 2019, Lu et al., 22 Dec 2025).
  • Relation to total mean curvature: On triangulated surfaces, the discrete TNC coincides exactly with the classical discrete total mean curvature, establishing a bridge to well-studied discrete geometry (Bergmann et al., 2019).
  • Fine control of smoothness: Higher-order graph Laplacian or biharmonic penalties (as in CURE) enforce smoothness via the $L^2$-norm of curvature, making TNC applicable to point-cloud and manifold denoising (Dong et al., 2019).
  • Input-space sharpness in learning: In deep networks, penalizing higher-order input derivatives directly shapes functional smoothness, reducing spurious sharpness and improving calibration (Poschl, 3 Nov 2025).

5. Applications and Empirical Performance

TNC regularization has demonstrated efficacy in a wide variety of applications:

  • Surface and image smoothing: The operator-splitting algorithm in (Lu et al., 22 Dec 2025) achieves strong performance on synthetic and real datasets, reducing staircasing and preserving corners with competitive or superior PSNR/SSIM compared to Euler’s elastica, mean curvature, and other total curvature TV methods.
  • Mesh denoising and inverse inclusion detection: In geometric inversion tasks, TNC preserves polyhedral shapes under noise and enables accurate detection of inclusions in PDE-constrained reconstruction (Bergmann et al., 2019).
  • Manifold learning and missing data: The CURE model, combining dimension and curvature regularization, demonstrates improved performance over low-dimension-only methods for image inpainting and semi-supervised learning (Dong et al., 2019).
  • Deep generative models: Extrinsic and intrinsic curvature penalties in autoencoders produce flatter learned manifolds and lower reconstruction errors, with TNC-based (extrinsic) regularization achieving substantial gains on motion-capture data (Lee et al., 2023).
  • Neural network generalization: Curvature rate regularization attenuates overfitting sharpness, maintains test accuracy, and improves expected confidence error, outperforming first-order methods and competing with SAM on MNIST and synthetic data (Poschl, 3 Nov 2025).

Selected quantitative results from (Lu et al., 22 Dec 2025):

| Task | Method | $\ell_1$ error | PSNR | SSIM |
| --- | --- | --- | --- | --- |
| Surface ("Square") | TNC | 49.86 | – | – |
| Surface ("Square") | EE | 69.32 | – | – |
| Image ("Zelda") | TNC | – | 34.76 | 0.8936 |
| Image ("Zelda") | EE/MC/TAC | – | lower | lower |

6. Connections, Extensions, and Open Directions

TNC regularization interfaces with a broad spectrum of geometric and learning-theoretic methods:

  • Discrete differential geometry: The coincidence of discrete TNC with total mean curvature on triangulations enables seamless integration with classical shape analysis (Bergmann et al., 2019).
  • Convex relaxations and variational lifting: The total roto-translational variation (in lifted orientation space) provides convex relaxations for curvature energies, with applications to image segmentation and inpainting (Chambolle et al., 2017).
  • Manifold biharmonic equations: The CURE framework in data science interprets TNC as manifold biharmonic regularization, computable over discretized point clouds (Dong et al., 2019).
  • Learning and generalization: Curvature rate regularization (CRR) formalizes the connection between higher-order input geometry and learnability, offering parameterization-invariant sharpness control (Poschl, 3 Nov 2025).

Open questions include: characterization of stationary shapes under TNC, convergence analysis of Riemannian ADMM and splitting methods, automated model-weight selection, extension to variable mesh topologies and higher-order discretizations, and new inverse problems (e.g., crack detection, surface segmentation, joint shape-topology optimization) (Bergmann et al., 2019, Lu et al., 22 Dec 2025). Mechanistic understanding and optimal tuning strategies for TNC-type regularization in neural network training remain active areas of investigation.
