Length-Normalized Path Signature (LNPS)
- LNPS is a mathematical descriptor that normalizes iterated-integral path signatures by powers of the path length, ensuring compact and invariant representations.
- It employs a truncated signature expansion with dynamic programming to efficiently compute features on sliding windows, making real-time processing feasible.
- LNPS has been applied in online biometric signature verification using RNNs, achieving state-of-the-art error rates by capturing essential geometric and asymptotic properties.
The length-normalized path signature (LNPS) is a mathematical descriptor for continuous paths of bounded variation, designed to achieve compactness, scale-invariance, and (via suitable linear combinations) rotation-invariance. LNPS is constructed by normalizing the coefficients of the truncated iterated-integral (path signature) expansion by powers of the path length. Originally proposed and validated in the context of online dynamic pen-trajectories for biometric signature verification, LNPS builds on foundational results in the theory of path signatures and has well-understood geometric and asymptotic properties (Lai et al., 2017, Boedihardjo et al., 2020).
1. Truncated Path Signature and Length-Normalization
Let $X: [0, T] \to \mathbb{R}^d$ be a continuous path of bounded variation, or its discrete (piecewise-linear) sampling. The level-$k$ iterated integral of $X$ is

$$S^{(i_1, \ldots, i_k)}(X) = \int_{0 < t_1 < \cdots < t_k < T} dX^{i_1}_{t_1} \cdots dX^{i_k}_{t_k},$$

with $i_1, \ldots, i_k \in \{1, \ldots, d\}$. The truncated path signature up to level $m$ is

$$S^{\le m}(X) = \left(1,\ S^{(i)}(X),\ S^{(i_1, i_2)}(X),\ \ldots\right),$$

where $(i_1, \ldots, i_k)$ spans all multi-indices of order $k \le m$. In practice, the $k$th level is interpreted as a $d^k$-dimensional tensor. Discrete approximations sum over products of local path increments:

$$S^{(i_1, \ldots, i_k)}(X) \approx \sum_{j_1 < \cdots < j_k} \Delta X^{i_1}_{j_1} \cdots \Delta X^{i_k}_{j_k},$$

with $\Delta X_j = X_{t_{j+1}} - X_{t_j}$.
Define the total path length

$$L(X) = \int_0^T |dX_t|,$$

or discretely as $L(X) = \sum_j |\Delta X_j|$.
The length-normalized path signature up to order $m$ is then

$$\mathrm{LNPS}^{\le m}(X) = \left(1,\ \frac{S^{(i)}(X)}{L(X)},\ \frac{S^{(i_1, i_2)}(X)}{L(X)^2},\ \ldots\right),$$

with each level-$k$ entry divided by $L(X)^k$ (Lai et al., 2017).
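The definition above can be sketched directly in numpy. The following is a minimal illustration (not the authors' implementation): it computes the exact piecewise-linear signature up to level 2 — ordered cross terms plus the $\tfrac{1}{2}\Delta X_j \otimes \Delta X_j$ contribution of each linear segment — and divides level $k$ by $L^k$.

```python
import numpy as np

def lnps_level2(path):
    """Length-normalized path signature up to level 2 of a discrete path.

    path: (n, d) array of sample points. Level 1 is the total increment;
    level 2 sums increment products over ordered pairs, with the extra
    0.5 * dX ⊗ dX term contributed by each linear segment. Level-k entries
    are divided by L**k, where L is the polygonal path length.
    """
    inc = np.diff(path, axis=0)               # local increments ΔX_j, shape (n-1, d)
    L = np.linalg.norm(inc, axis=1).sum()     # discrete path length
    s1 = inc.sum(axis=0)                      # level 1: total increment
    cum = np.cumsum(inc, axis=0)
    prev = np.vstack([np.zeros(path.shape[1]), cum[:-1]])  # partial sums before step j
    # level 2: sum_{l<j} ΔX_l ⊗ ΔX_j  +  0.5 * sum_j ΔX_j ⊗ ΔX_j
    s2 = prev.T @ inc + 0.5 * inc.T @ inc
    return s1 / L, s2 / L**2
```

For a straight line the normalized level-1 term is the unit direction vector, and the level-2 term is half its outer square, independent of the line's length, as scale-invariance requires.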
2. Invariance Properties
Scale Invariance
If $\tilde{X}_t = \lambda X_t$ for some $\lambda > 0$, then

$$S^{(i_1, \ldots, i_k)}(\tilde{X}) = \lambda^k S^{(i_1, \ldots, i_k)}(X),$$

so $L(\tilde{X}) = \lambda L(X)$ and $L(\tilde{X})^k = \lambda^k L(X)^k$. Thus,

$$\mathrm{LNPS}^{\le m}(\tilde{X}) = \mathrm{LNPS}^{\le m}(X),$$

demonstrating strict scale-invariance at every order (Lai et al., 2017).
Rotation Invariance
For $R \in SO(d)$, $S^{(i_1, \ldots, i_k)}(RX) = R^{i_1}_{j_1} \cdots R^{i_k}_{j_k}\, S^{(j_1, \ldots, j_k)}(X)$ (summing over repeated indices). Any contraction of $S^{(k)}(X)$ with an $SO(d)$-invariant tensor therefore yields a rotation-invariant scalar. In 2D, at $k = 2$, the antisymmetric form

$$A(X) = \tfrac{1}{2}\left(S^{(1,2)}(X) - S^{(2,1)}(X)\right)$$

gives the signed area enclosed by the path (by Green's theorem), invariant under $SO(2)$. Higher-order invariants can be constructed via the representation theory of $SO(d)$ (Lai et al., 2017).
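A quick numerical check of this invariance (a sketch, not from the paper): the level-2 tensor of a counter-clockwise unit square yields signed area 1 via the antisymmetric combination, and the value is unchanged after rotating the path.

```python
import numpy as np

def level2_sig(path):
    """Level-2 iterated sums of a piecewise-linear 2D path."""
    inc = np.diff(path, axis=0)
    prev = np.vstack([np.zeros(2), np.cumsum(inc, axis=0)[:-1]])
    return prev.T @ inc + 0.5 * inc.T @ inc

# Counter-clockwise unit square (closed path), then a rotated copy.
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]], float)
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

S2 = level2_sig(square)
S2r = level2_sig(square @ R.T)
area = 0.5 * (S2[0, 1] - S2[1, 0])     # signed enclosed area via Green's theorem
area_r = 0.5 * (S2r[0, 1] - S2r[1, 0])  # unchanged under SO(2)
```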
3. Asymptotic and Geometric Characterization
The normalized signature is central to the isometry conjecture: for a path $X$ of finite length $L(X)$, the normalized $n$th-level signature satisfies

$$\left(n!\, \|S^{(n)}(X)\|\right)^{1/n} \le L(X),$$

where $\|\cdot\|$ is the projective tensor norm.
The conjecture (proved in (Boedihardjo et al., 2020) for planar paths with the tree-reduced property or local angle-boundedness) is

$$\lim_{n \to \infty} \left(n!\, \|S^{(n)}(X)\|\right)^{1/n} = L(X)$$

for all tree-reduced paths. This demonstrates that the normalized signature encodes global, geometric information about the underlying curve, supplying a natural feature for identity and invariance.
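As a simple sanity check of the normalization (a worked example, not from the cited papers), consider a straight-line path $X_t = t v$ for $t \in [0, L]$ with $|v| = 1$. Its level-$n$ signature is the symmetric tensor

$$S^{(n)}(X) = \frac{L^n}{n!}\, v^{\otimes n}, \qquad \|S^{(n)}(X)\| = \frac{L^n}{n!},$$

so that

$$\left(n!\, \|S^{(n)}(X)\|\right)^{1/n} = \left(n! \cdot \frac{L^n}{n!}\right)^{1/n} = L$$

exactly, for every $n$: the conjectured limit holds trivially for line segments, and the general statement asserts the same asymptotic recovery of length for all tree-reduced paths.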
4. Practical Computation and Complexity
LNPS is computed on sliding windows of size $W$ across sequences of length $N$. At each window centered at time $t$, a local subpath is extracted, iterated sums up to level $m$ are computed, and normalization by the local length $L^k$ is performed at every order $k$. Each LNPS channel is then normalized over the sequence.
Naively, computing each level-$k$ coefficient in one window would require $O(W^k)$ operations, but dynamic programming (Chen's identity) reduces this to $O(W)$ per coefficient. Overall complexity per window is $O(W d^m)$, leading to total cost $O(N W d^m)$. For standard settings ($d = 2$ or $3$, window widths up to $13$), real-time computation for data sampled at 100 Hz is achieved (Lai et al., 2017).
5. Application to Online Signature Verification with Neural Sequence Models
LNPS vectors are used as input features to an RNN for biometric verification. The data pipeline is as follows:
- Input: a sequence of LNPS feature vectors $\{x_t\}_{t=1}^N$, with $x_t$ computed from the $W$-sample window centered at $t$.
- Architecture: a two-layer Gated Recurrent Unit (GRU) with 128 units per layer, followed by a 64-dimensional fully connected output.
- Training: employs a triplet loss (pushing genuine-positive pairs closer than forgeries by a margin $\alpha$) and a center loss (clustering per-client signatures) with regularization weight $\lambda$. The final loss:

$$\mathcal{L} = \mathcal{L}_{\mathrm{triplet}} + \lambda\, \mathcal{L}_{\mathrm{center}}.$$
Triplet loss:

$$\mathcal{L}_{\mathrm{triplet}} = \max\left(0,\ \|f(a) - f(p)\|^2 - \|f(a) - f(n)\|^2 + \alpha\right).$$

Center loss:

$$\mathcal{L}_{\mathrm{center}} = \frac{1}{2} \sum_i \|f(x_i) - c_{y_i}\|^2,$$

where $f$ is the RNN embedding, $a$, $p$, $n$ denote anchor, positive, and forgery samples, and $c_y$ is the center for client $y$ (Lai et al., 2017).
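The two loss terms can be sketched in plain numpy (an illustrative version with placeholder hyperparameter values, not the published ones; in practice these would be differentiable ops in a deep-learning framework):

```python
import numpy as np

def triplet_loss(f_a, f_p, f_n, margin=0.5):
    """Hinge on squared distances: the genuine pair (anchor, positive)
    must be closer than the forgery pair (anchor, negative) by `margin`."""
    d_ap = np.sum((f_a - f_p) ** 2)
    d_an = np.sum((f_a - f_n) ** 2)
    return max(0.0, d_ap - d_an + margin)

def center_loss(embeddings, labels, centers):
    """Pull each embedding toward its client's learned center c_y."""
    return 0.5 * sum(np.sum((e - centers[y]) ** 2)
                     for e, y in zip(embeddings, labels))

def total_loss(f_a, f_p, f_n, embeddings, labels, centers,
               lam=0.01, margin=0.5):
    # Combined objective L = L_triplet + lambda * L_center
    # (`lam` and `margin` here are placeholders, not the paper's values).
    return (triplet_loss(f_a, f_p, f_n, margin)
            + lam * center_loss(embeddings, labels, centers))
```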
6. Empirical Results and Comparative Performance
LNPS, when used alone in a DTW-based verification framework, demonstrated progressive gains at higher signature levels:
- Level 1 (first order): ≈8.1–8.7% EER
- Level 2: ≈5.4–6.6% EER
- Level 4: 4.98% EER (best)
- Using rotation-invariant features up to level 4: 5.26% EER
Integration of LNPS with the RNN system (trained jointly on SVC-2004 and MCYT-100), using multiple reference templates per client, achieved a 2.37% EER (state of the art at publication). An ablation removing LNPS in favor of the raw input features resulted in a substantially degraded EER (9.0%). Excluding MCYT-100 from training increased EER to 3.58%; adding more clients improved performance, converging to 2.37% (Lai et al., 2017).
7. Role in Path Geometry and Fundamental Results
LNPS captures essential geometric aspects of trajectories beyond the biometric context. As established in the theoretical analysis of (Boedihardjo et al., 2020), the correct asymptotic normalization of the path signature recovers the path length under mild geometric (tree-reduced or local angle-constrained) conditions. The framework uses a Cartan development into hyperbolic space, decoupling the signature asymptotics into radial and angular ODEs, with length recovery intimately linked to the angular evolution of the path. These properties affirm the foundational role of LNPS in geometric data analysis, stochastic processes, and sequential learning.