Total Normal Curvature Regularization
- Total normal curvature regularization is defined by integrating directional normal curvatures to preserve sharp geometric features like edges and corners.
- It employs variational formulations and operator-splitting schemes that successfully smooth surfaces and images while maintaining key structures.
- Applications range from triangulated surface smoothing and graph-based denoising to curvature penalties in deep generative models and neural network training.
Total normal curvature regularization is a class of variational methods that penalize the integrated absolute normal curvature over a domain, surface, or data manifold. It generalizes scalar curvature-based regularizers and provides a directionally isotropic means of promoting geometric fidelity—preserving sharp features such as edges and corners—across applications ranging from geometric inverse problems and surface processing to deep generative modeling and neural networks. The regularizer is typically implemented either via direct integration of pointwise normal curvatures from multiple directions or as an aggregate of higher-order differential operators that encode extrinsic, intrinsic, or data-induced curvature.
1. Mathematical Definition and Geometric Foundations
Total normal curvature (TNC) regularization is defined by integrating a local measure of normal curvature over all tangent directions at each point in the domain. For a graph surface $z = u(x, y)$, the directional normal curvature at $x$ in direction $d_\theta = (\cos\theta, \sin\theta)^\top$ is
$$\kappa_\theta(x) \;=\; \frac{d_\theta^\top \nabla^2 u(x)\, d_\theta}{\sqrt{1+|\nabla u(x)|^2}\,\bigl(1+(\nabla u(x)\cdot d_\theta)^2\bigr)},$$
where $\nabla u$ is the gradient and $\nabla^2 u$ is the Hessian. The total normal curvature at $x$ is then
$$\mathcal{K}(x) \;=\; \int_0^{\pi} \lvert \kappa_\theta(x) \rvert \,\mathrm{d}\theta.$$
A typical variational model for denoising or surface smoothing is
$$\min_u\; \alpha \int_\Omega \mathcal{K}(x)\,\mathrm{d}x \;+\; \beta \int_\Omega |\nabla u|\,\mathrm{d}x \;+\; \frac{\gamma}{2} \int_\Omega (u - f)^2\,\mathrm{d}x,$$
where $\alpha$, $\beta$, $\gamma$ are weights for the TNC, TV, and fidelity terms, respectively (Lu et al., 22 Dec 2025).
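To make the directional quadrature concrete, the following is a minimal numpy sketch that evaluates the TNC of a discretized graph surface by sampling $K$ directions; the finite-difference stencils, the midpoint quadrature rule, and the function name are illustrative choices, not the implementation of (Lu et al., 22 Dec 2025).

```python
import numpy as np

def total_normal_curvature(u, K=8, h=1.0):
    """Approximate the total normal curvature of the graph surface z = u(x, y)
    by midpoint quadrature over K directions in [0, pi) (illustrative sketch)."""
    # Central-difference gradient and Hessian entries (axis 0 = y, axis 1 = x).
    uy, ux = np.gradient(u, h)
    uyy, _ = np.gradient(uy, h)
    uxy, uxx = np.gradient(ux, h)
    grad_sq = ux ** 2 + uy ** 2

    tnc = np.zeros_like(u)
    thetas = np.pi * (np.arange(K) + 0.5) / K        # midpoint rule on [0, pi)
    for theta in thetas:
        c, s = np.cos(theta), np.sin(theta)
        num = uxx * c ** 2 + 2.0 * uxy * c * s + uyy * s ** 2   # d^T (Hess u) d
        den = np.sqrt(1.0 + grad_sq) * (1.0 + (ux * c + uy * s) ** 2)
        tnc += np.abs(num / den) * (np.pi / K)                  # quadrature weight
    return tnc.sum() * h ** 2                                   # integral over the grid
```

In a denoising energy, this quantity would be weighted by $\alpha$ and combined with the TV and fidelity terms above.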
For triangulated surfaces in $\mathbb{R}^3$, the discrete total normal variation is
$$\mathrm{TNV}(S) \;=\; \sum_{e \in E} \ell_e \, d_{\mathbb{S}^2}\!\bigl(n_e^{+}, n_e^{-}\bigr),$$
where $\ell_e$ is the edge length, $n_e^{\pm}$ are the unit normals of the faces adjacent to edge $e$, and $d_{\mathbb{S}^2}$ is the geodesic (angular) distance on the unit sphere. This exactly recovers the classical discrete total mean curvature (Bergmann et al., 2019).
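As an illustration of this edgewise sum, the sketch below evaluates the discrete total normal variation of a vertex/face mesh; the edge-to-face adjacency construction and the handling of boundary edges are assumptions of this sketch, not taken from (Bergmann et al., 2019).

```python
import numpy as np

def discrete_total_normal_variation(verts, faces):
    """Sum over interior edges of edge length times the angle between the
    adjacent face normals, i.e. the discrete total mean curvature.
    verts: (V, 3) float array; faces: (F, 3) int array."""
    # Unit normal of each triangular face.
    v0, v1, v2 = verts[faces[:, 0]], verts[faces[:, 1]], verts[faces[:, 2]]
    n = np.cross(v1 - v0, v2 - v0)
    n /= np.linalg.norm(n, axis=1, keepdims=True)

    # Map each undirected edge to the faces containing it.
    edge_faces = {}
    for fi, f in enumerate(faces):
        for a, b in ((f[0], f[1]), (f[1], f[2]), (f[2], f[0])):
            edge_faces.setdefault((min(a, b), max(a, b)), []).append(fi)

    total = 0.0
    for (a, b), fs in edge_faces.items():
        if len(fs) != 2:              # skip boundary / non-manifold edges
            continue
        length = np.linalg.norm(verts[a] - verts[b])
        cosang = np.clip(np.dot(n[fs[0]], n[fs[1]]), -1.0, 1.0)
        total += length * np.arccos(cosang)   # geodesic distance on the sphere
    return total
```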
In generative modeling, the extrinsic curvature regularizer is constructed in terms of the generalized Gauss map and its differential, leading to a coordinate-invariant curvature density integrated over the sampling distribution in latent space (Lee et al., 2023).
2. Variational and Algorithmic Formulations
TNC regularization leads to variational PDEs of higher order. The constrained minimization in (Lu et al., 22 Dec 2025) is reformulated as the steady state of a gradient-flow system in which the TNC, TV, and fidelity terms are coupled through auxiliary variables and constraint indicators.
Time integration is conducted via operator splitting:
- Step 1: TNC subproblem, solved by fixed-point/ADMM iterations
- Step 2: TV shrinkage via closed-form vectorial soft-thresholding (see the sketch below)
- Step 3: Enforcement of the constraints coupling the auxiliary variables to the solution
- Step 4: Data-fidelity projection
These steps are efficiently solvable using closed-form updates, fixed-point iterations, ADMM cycles, and FFT-based Poisson solvers for each subproblem. Directional integration is implemented with $K$-angle quadrature (Lu et al., 22 Dec 2025).
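For concreteness, the TV step admits the standard closed-form vectorial soft-thresholding used throughout splitting and ADMM schemes of this kind; the sketch below is generic, and the variable names and threshold parameterization are illustrative rather than those of (Lu et al., 22 Dec 2025).

```python
import numpy as np

def vectorial_shrinkage(q, tau):
    """Closed-form TV update: shrink the vector field q (shape (..., 2))
    toward zero by threshold tau while preserving its direction."""
    norm = np.linalg.norm(q, axis=-1, keepdims=True)
    scale = np.maximum(norm - tau, 0.0) / np.maximum(norm, 1e-12)  # guard 0/0
    return scale * q
```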
For triangulated surfaces, a Riemannian split Bregman (manifold ADMM) scheme alternates between updating vertex positions, jump vectors (via vectorial shrinkage), and Lagrange multipliers, using log-maps and parallel transport on the sphere of unit normals (Bergmann et al., 2019). In deep learning, the regularizer is implemented as a penalty on input derivatives up to a prescribed order, with directional-derivative estimates obtained by backpropagation (Poschl, 3 Nov 2025).
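As a purely illustrative counterpart to those backpropagation-based estimates, the sketch below approximates higher-order directional derivatives of a scalar function by one-dimensional finite differences along a random direction and fits the exponential growth rate of their magnitudes; the finite-difference substitution, function names, and fitting choice are assumptions, not the procedure of (Poschl, 3 Nov 2025).

```python
import numpy as np
from math import comb

def directional_growth_rate(f, x, order=4, eps=1e-2, seed=0):
    """Fit the exponential growth rate of |D_v^n f(x)| versus the order n,
    estimating D_v^n by n-th central differences along a random unit direction v
    (illustrative sketch; backpropagation-based estimates would be used in practice)."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(x.shape)
    v /= np.linalg.norm(v)

    log_norms = []
    for n in range(1, order + 1):
        # n-th central difference of g(t) = f(x + t v), divided by eps^n.
        vals = [(-1) ** k * comb(n, k) * f(x + (n / 2.0 - k) * eps * v)
                for k in range(n + 1)]
        log_norms.append(np.log(abs(sum(vals)) / eps ** n + 1e-30))

    # Slope of log|D_v^n f| against n gives the directional growth rate.
    slope, _ = np.polyfit(np.arange(1, order + 1), log_norms, 1)
    return slope
```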
3. Discrete, Graph, and Manifold Implementations
TNC models admit several forms of discretization depending on the data representation:
- Triangulated surfaces: Edgewise sums using geodesic distances between normals precisely recover classical discrete total mean curvature. The model is scaling-invariant and insensitive to the presence of flat facets, favoring piecewise-flat reconstructions (Bergmann et al., 2019).
- Image domains: Discrete sampling of directional curvatures over multiple quadrature directions gives isotropic, edge-aware smoothing.
- Graph-based data: In the CURE model, the penalty is the squared norm of graph-Laplacian terms, mimicking a manifold biharmonic energy over a finite point cloud and enforcing smoothness beyond mere manifold-dimension constraints (Dong et al., 2019); a minimal sketch appears after this list.
- Neural networks: The curvature rate is estimated by fitting the exponential growth rate of higher-order derivative norms against derivative order, and the corresponding CRR loss penalizes this rate via efficient directional derivative estimates (Poschl, 3 Nov 2025).
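Referring to the graph-based bullet above, here is a minimal sketch of a Laplacian-based curvature penalty on a point cloud: it builds an unnormalized kNN Laplacian $L$ and evaluates $\|L F\|_F^2$ for a signal $F$ on the nodes. The kNN construction, Gaussian weights, and function name are illustrative assumptions, not the CURE implementation (Dong et al., 2019).

```python
import numpy as np

def laplacian_curvature_penalty(X, F, k=8):
    """Sketch of a graph-Laplacian smoothness penalty ||L F||_F^2, where L is an
    unnormalized kNN Laplacian built from the point cloud X (n, d) and F (n, m)
    is a signal on its nodes (e.g. the data being reconstructed)."""
    n = X.shape[0]
    # Pairwise squared distances and k nearest neighbours (excluding self).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)
    nbrs = np.argsort(d2, axis=1)[:, :k]

    # Symmetrized adjacency with Gaussian weights, then L = D - W.
    rows = np.repeat(np.arange(n), k)
    sigma2 = np.median(d2[rows, nbrs.ravel()])
    W = np.zeros((n, n))
    W[rows, nbrs.ravel()] = np.exp(-d2[rows, nbrs.ravel()] / sigma2)
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(1)) - W

    return np.sum((L @ F) ** 2)
```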
The following table summarizes primary discretization strategies:
| Setting | Discrete TNC Functional | Key Features |
|---|---|---|
| Triangulated surface | $\sum_{e \in E} \ell_e \, d_{\mathbb{S}^2}(n_e^{+}, n_e^{-})$ | Exact discrete total mean curvature (Bergmann et al., 2019) |
| Images, graphs | $K$-direction sum/integral of $\lvert \kappa_\theta \rvert$ | Multi-direction, isotropic (Lu et al., 22 Dec 2025) |
| Point clouds | Squared norm of graph-Laplacian terms | Higher-order (biharmonic) smoothness (Dong et al., 2019) |
| Neural networks | Norms of input-space higher-order derivatives | Curvature-rate penalty (Poschl, 3 Nov 2025) |
4. Comparative Properties and Relation to Other Curvature Models
TNC regularization is distinct from and complementary to scalar curvature-based models, including Euler’s elastica (squared curvature penalty), mean curvature flow, and Gaussian curvature TV:
- Edge and corner preservation: Integrating over directions increases isotropy and enables preservation of sharp geometric features. Unlike area-based or mean curvature squared priors, TNC regularization does not penalize flat faces or sharp corners, producing reconstructions with well-defined edges (Bergmann et al., 2019, Lu et al., 22 Dec 2025).
- Relation to total mean curvature: On triangulated surfaces, the discrete TNC coincides exactly with the classical discrete total mean curvature, establishing a bridge to well-studied discrete geometry (Bergmann et al., 2019).
- Fine control of smoothness: Higher-order graph Laplacian or biharmonic penalties (as in CURE) enforce smoothness via the $L^2$-norm of curvature, making TNC applicable to point-cloud and manifold denoising (Dong et al., 2019).
- Input-space sharpness in learning: In deep networks, penalizing higher-order input derivatives directly shapes functional smoothness, reducing spurious sharpness and improving calibration (Poschl, 3 Nov 2025).
5. Applications and Empirical Performance
TNC regularization has demonstrated efficacy in a wide variety of applications:
- Surface and image smoothing: The operator-splitting algorithm in (Lu et al., 22 Dec 2025) achieves strong performance on synthetic and real datasets, reducing staircasing and preserving corners with competitive or superior PSNR/SSIM compared to Euler’s elastica, mean curvature, and other total curvature TV methods.
- Mesh denoising and inverse inclusion detection: In geometric inversion tasks, TNC preserves polyhedral shapes under noise and enables accurate detection of inclusions in PDE-constrained reconstruction (Bergmann et al., 2019).
- Manifold learning and missing data: The CURE model, combining dimension and curvature regularization, demonstrates improved performance over low-dimension-only methods for image inpainting and semi-supervised learning (Dong et al., 2019).
- Deep generative models: Extrinsic and intrinsic curvature penalties in autoencoders produce flatter learned manifolds and lower reconstruction errors, with TNC-based (extrinsic) regularization achieving substantial gains on motion-capture data (Lee et al., 2023).
- Neural network generalization: Curvature rate regularization attenuates overfitting sharpness, maintains test accuracy, and improves expected confidence error, outperforming first-order methods and competing with SAM on MNIST and synthetic data (Poschl, 3 Nov 2025).
Selected quantitative results from (Lu et al., 22 Dec 2025):
| Task | Method | Error | PSNR | SSIM |
|---|---|---|---|---|
| Surface (“Square”) | TNC | 49.86 | — | — |
| Surface (“Square”) | EE | 69.32 | — | — |
| Image (“Zelda”) | TNC | — | 34.76 | 0.8936 |
| Image (“Zelda”) | EE/MC/TAC | — | lower | lower |
6. Connections, Extensions, and Open Directions
TNC regularization interfaces with a broad spectrum of geometric and learning-theoretic methods:
- Discrete differential geometry: TNC coinciding with total mean curvature on triangulations enables seamless integration with classical shape analysis (Bergmann et al., 2019).
- Convex relaxations and variational lifting: The total roto-translational variation (in lifted orientation space) provides convex relaxations for curvature energies, with applications to image segmentation and inpainting (Chambolle et al., 2017).
- Manifold biharmonic equations: The CURE framework in data science translates TNC as manifold biharmonic regularization, computable over discretized point clouds (Dong et al., 2019).
- Learning and generalization: Curvature rate regularization (CRR) formalizes the connection between higher-order input geometry and learnability, offering parameterization-invariant sharpness control (Poschl, 3 Nov 2025).
Open questions include: characterization of stationary shapes under TNC, convergence analysis of Riemannian ADMM and splitting methods, automated model-weight selection, extension to variable mesh topologies and higher-order discretizations, and new inverse problems (e.g., crack detection, surface segmentation, joint shape-topology optimization) (Bergmann et al., 2019, Lu et al., 22 Dec 2025). Mechanistic understanding and optimal tuning strategies for TNC-type regularization in neural network training remain active areas of investigation.
7. References
- Discrete total normal variation and manifold-valued TV: (Bergmann et al., 2019)
- Variational integration and operator splitting for TNC: (Lu et al., 22 Dec 2025)
- Convex relaxation in roto-translation and curvature energies: (Chambolle et al., 2017)
- Curvature-based deep generative model regularization: (Lee et al., 2023)
- Curvature rate regularization and sharpness in neural networks: (Poschl, 3 Nov 2025)
- Biharmonic graph/multimanifold TNC (CURE) in imaging: (Dong et al., 2019)