Vector-Valued Distance on SPD
- The paper introduces a vector-valued distance that fully characterizes the relative position between SPD matrices via eigen-log decomposition.
- SPD matrices are viewed as a Riemannian manifold, on which the vector-valued distance encodes multi-directional geometric displacements.
- The computation utilizes eigendecomposition under an affine-invariant Riemannian metric, ensuring isometric invariance and consistency with scalar distance norms.
The vector-valued distance (VVD) on the manifold of symmetric positive definite matrices, denoted SPDₙ, provides a complete Riemannian invariant that encodes the full relative position of two points in the space. Unlike scalar Riemannian distances, the VVD delivers a higher-fidelity geometrical descriptor that supports both theoretical analysis and practical applications in manifold learning, geometric representation, and information geometry.
1. Geometric Structure of SPDₙ
SPDₙ is the set
$$\mathrm{SPD}_n = \{\, P \in \mathbb{R}^{n \times n} : P = P^\top,\ x^\top P x > 0 \ \text{for all } x \in \mathbb{R}^n \setminus \{0\} \,\},$$
comprising all real, symmetric, positive definite matrices. As a manifold, SPDₙ is smooth and of dimension $n(n+1)/2$, with tangent space at $P \in \mathrm{SPD}_n$ identified as $T_P\,\mathrm{SPD}_n \cong \mathrm{Sym}_n$, where $\mathrm{Sym}_n$ is the vector space of real symmetric $n \times n$ matrices.
The manifold possesses a rich structure as a Riemannian symmetric space. The general linear group $\mathrm{GL}(n, \mathbb{R})$ acts transitively by congruence, $g \cdot P = g P g^\top$, preserving the Riemannian structure by isometries. This action comprises “translations” by positive definite $g$ (no fixed point), “rotations” by $g \in \mathrm{O}(n)$ (fixing the identity matrix $I$), and the involutive “reflection” $P \mapsto P^{-1}$.
The affine-invariant Riemannian metric is given by
$$\langle X, Y \rangle_P = \operatorname{tr}\!\left(P^{-1} X\, P^{-1} Y\right)$$
for $X, Y \in T_P\,\mathrm{SPD}_n$. Isometries under $g \in \mathrm{GL}(n, \mathbb{R})$ manifest as
$$d\!\left(g P g^\top,\ g Q g^\top\right) = d(P, Q).$$
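The metric and its invariance under the congruence action can be sketched numerically. This is a minimal numpy illustration, not code from the paper; the helper names `random_spd` and `affine_invariant_inner`, and the seed, are illustrative choices:

```python
import numpy as np

def random_spd(n, rng):
    """A generic SPD test matrix: A A^T plus a ridge (illustrative helper)."""
    A = rng.standard_normal((n, n))
    return A @ A.T + n * np.eye(n)

def affine_invariant_inner(P, X, Y):
    """Affine-invariant metric <X, Y>_P = tr(P^{-1} X P^{-1} Y)."""
    Pinv = np.linalg.inv(P)
    return np.trace(Pinv @ X @ Pinv @ Y)

rng = np.random.default_rng(0)
P = random_spd(4, rng)
X = rng.standard_normal((4, 4))
X = (X + X.T) / 2                      # a symmetric tangent vector at P
g = rng.standard_normal((4, 4))        # g in GL(n), acting by g . P = g P g^T

# The congruence action is isometric: the inner product is preserved.
lhs = affine_invariant_inner(P, X, X)
rhs = affine_invariant_inner(g @ P @ g.T, g @ X @ g.T, g @ X @ g.T)
assert np.isclose(lhs, rhs)
```

The assertion checks the invariance identity $\langle g X g^\top, g X g^\top \rangle_{g P g^\top} = \langle X, X \rangle_P$, which follows by cyclicity of the trace.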
2. Construction of the Vector-Valued Distance
For $P, Q \in \mathrm{SPD}_n$, the VVD produces a vector in $\mathbb{R}^n$ that characterizes the displacement from $P$ to $Q$. The construction proceeds as follows:
- Compute $M = P^{-1/2} Q P^{-1/2}$, which belongs to $\mathrm{SPD}_n$ and is thus orthogonally diagonalisable.
- Perform an orthogonal eigendecomposition $M = U \Lambda U^\top$, where $U \in \mathrm{O}(n)$ and $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ with $\lambda_1 \geq \cdots \geq \lambda_n > 0$.
- The Riemannian logarithm at the identity coincides with the matrix logarithm: $\log M = U \operatorname{diag}(\log \lambda_1, \ldots, \log \lambda_n)\, U^\top$.
- By definition (López et al. (2021), §2.2), the vector-valued distance is
$$v(P, Q) = (\log \lambda_1, \ldots, \log \lambda_n) \in \mathbb{R}^n,$$
with the ordering $\lambda_1 \geq \cdots \geq \lambda_n$ ensuring placement in the “positive Weyl chamber.”
Each component $\log \lambda_i$ measures the signed expansion along the $i$-th eigendirection, offering a directional decomposition of displacement within the non-Euclidean geometry.
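The steps above can be sketched in numpy, computing $P^{-1/2}$ from the eigendecomposition of $P$. This is an illustrative sketch (the function name `vvd` and the test matrices are my own choices, not the paper's code):

```python
import numpy as np

def vvd(P, Q):
    """Vector-valued distance: decreasing log-eigenvalues of P^{-1/2} Q P^{-1/2}."""
    w, V = np.linalg.eigh(P)
    S = V @ np.diag(w ** -0.5) @ V.T        # P^{-1/2} via eigendecomposition of P
    M = S @ Q @ S                           # SPD, hence orthogonally diagonalisable
    lam = np.linalg.eigvalsh(M)             # ascending, strictly positive eigenvalues
    return np.log(lam)[::-1]                # decreasing order: positive Weyl chamber

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)); P = A @ A.T + 3 * np.eye(3)
B = rng.standard_normal((3, 3)); Q = B @ B.T + 3 * np.eye(3)

v = vvd(P, Q)
assert np.allclose(vvd(P, P), 0.0)          # zero displacement from P to itself
assert np.all(np.diff(v) <= 0)              # components sorted decreasingly
```

Since $\log$ is monotone, sorting the eigenvalues decreasingly and taking logs directly yields the Weyl-chamber ordering.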
3. Properties of the Vector-Valued Distance
The VVD encapsulates several distinguished geometric and algebraic properties:
- Isometric Invariance: For any isometry $\varphi$ mapping $(P, Q)$ to $(\varphi(P), \varphi(Q))$, the matrices $P^{-1/2} Q P^{-1/2}$ and $\varphi(P)^{-1/2} \varphi(Q)\, \varphi(P)^{-1/2}$ share the same spectrum, ensuring $v(P, Q) = v(\varphi(P), \varphi(Q))$. Conversely, equal VVD vectors imply the pairs are related by an isometry.
- Anti-symmetry (up to permutation): The displacement reverses sign (up to order reversal): if the eigenvalues of $P^{-1/2} Q P^{-1/2}$ are $\lambda_1 \geq \cdots \geq \lambda_n$, then those of $Q^{-1/2} P Q^{-1/2}$ are $\lambda_1^{-1}, \ldots, \lambda_n^{-1}$, yielding
$$v(Q, P) = (-\log \lambda_n, \ldots, -\log \lambda_1)$$
after reordering.
- Triangle (Majorization) Property: For all $P, Q, R \in \mathrm{SPD}_n$,
$$v(P, R) \preceq v(P, Q) + v(Q, R),$$
with $\preceq$ the majorization order on $\mathbb{R}^n$. This property subsumes the ordinary triangle inequality under any symmetric norm.
- Complete Invariant: The VVD provides a complete isometry invariant for pairs in $\mathrm{SPD}_n \times \mathrm{SPD}_n$; no geometric information about relative position is lost.
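The invariance and anti-symmetry properties can be checked numerically. A small sketch under the same assumptions as before (the helper `vvd` and the random matrices are illustrative, not from the paper):

```python
import numpy as np

def vvd(P, Q):
    """Decreasing log-eigenvalues of P^{-1/2} Q P^{-1/2}."""
    w, V = np.linalg.eigh(P)
    S = V @ np.diag(w ** -0.5) @ V.T        # P^{-1/2}
    return np.log(np.linalg.eigvalsh(S @ Q @ S))[::-1]

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n)); P = A @ A.T + n * np.eye(n)
B = rng.standard_normal((n, n)); Q = B @ B.T + n * np.eye(n)
g = rng.standard_normal((n, n))             # an isometry acting by g . P = g P g^T

# Isometric invariance: the VVD is unchanged by the congruence action.
assert np.allclose(vvd(P, Q), vvd(g @ P @ g.T, g @ Q @ g.T))

# Anti-symmetry up to order reversal: v(Q, P) = -reverse(v(P, Q)).
assert np.allclose(vvd(Q, P), -vvd(P, Q)[::-1])
```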
4. Connection to Scalar Distance and Norms
The classical affine-invariant Riemannian distance emerges as the $\ell^2$-norm of the VVD:
$$d(P, Q) = \| v(P, Q) \|_2 = \Big( \sum_{i=1}^{n} \log^2 \lambda_i \Big)^{1/2}.$$
More generally, all $p$-norms of $v(P, Q)$, being invariant under coordinate permutations, yield Finsler metrics on $\mathrm{SPD}_n$ with the same group of isometries. Examples include:
\begin{align*}
d_{F_1}(P, Q) &= \sum_{i=1}^{n} |\log \lambda_i|, \\
d_{F_\infty}(P, Q) &= \max_{i} |\log \lambda_i|.
\end{align*}
This structure generalizes the interpretation of distance beyond scalar length to encompass multi-directional geometric information.
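The scalar distances are then just vector norms of the VVD. A brief sketch (same illustrative `vvd` helper as above):

```python
import numpy as np

def vvd(P, Q):
    """Decreasing log-eigenvalues of P^{-1/2} Q P^{-1/2}."""
    w, V = np.linalg.eigh(P)
    S = V @ np.diag(w ** -0.5) @ V.T
    return np.log(np.linalg.eigvalsh(S @ Q @ S))[::-1]

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4)); P = A @ A.T + 4 * np.eye(4)
B = rng.standard_normal((4, 4)); Q = B @ B.T + 4 * np.eye(4)

v = vvd(P, Q)
d_riem = np.linalg.norm(v)           # classical affine-invariant distance (l2)
d_f1   = np.linalg.norm(v, 1)        # Finsler metric: sum of |log lambda_i|
d_finf = np.linalg.norm(v, np.inf)   # Finsler metric: max of |log lambda_i|

# Standard norm inequalities on R^n relate the three distances.
assert d_finf <= d_riem <= d_f1
```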
5. Extension to General Symmetric Spaces
The vector-valued distance admits generalization to all noncompact Riemannian symmetric spaces of rank $r \geq 1$: it is defined by selecting a maximal flat through a reference point, conjugating an argument into it, and reading off “log-coordinates.” SPDₙ forms a canonical instance with rank $n$. Analogous constructions apply to Grassmannians, orthogonal groups, and beyond, rooted in the general Lie-theoretic setting of Weyl chambers and Cartan subalgebras (see Helgason (1978); Kapovich–Leeb–Porti (2017)).
6. Algorithmic Computation of VVD
The computation of the VVD between $P, Q \in \mathrm{SPD}_n$ proceeds via the following pseudocode:
- Compute $M = P^{-1/2} Q P^{-1/2}$ (equivalently, this may be solved as the generalized eigenvalue problem $Q u = \lambda P u$).
- Perform the eigendecomposition $M = U \operatorname{diag}(\lambda_1, \ldots, \lambda_n)\, U^\top$.
- Assemble $v \in \mathbb{R}^n$ by $v_i = \log \lambda_i$ for $i = 1, \ldots, n$.
- Return $v$ sorted in decreasing order.
The dominant computational cost arises from the eigendecomposition, which is $O(n^3)$. Implementation details, proofs, and further algorithmic considerations are discussed in the appendices of López et al. (2021).
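The generalized-eigenproblem route avoids forming $P^{-1/2}$ explicitly, since the generalized eigenvalues of $(Q, P)$ coincide with the eigenvalues of $P^{-1/2} Q P^{-1/2}$. A sketch using `scipy.linalg.eigh` (the function name `vvd_generalized` and test data are illustrative):

```python
import numpy as np
from scipy.linalg import eigh

def vvd_generalized(P, Q):
    """VVD via the generalized eigenproblem Q u = lambda P u (no matrix sqrt)."""
    lam = eigh(Q, P, eigvals_only=True)   # ascending, all positive for SPD inputs
    return np.log(lam)[::-1]              # decreasing log-eigenvalues

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 5)); P = A @ A.T + 5 * np.eye(5)
B = rng.standard_normal((5, 5)); Q = B @ B.T + 5 * np.eye(5)

v = vvd_generalized(P, Q)
assert np.all(np.diff(v) <= 0)            # sorted into the positive Weyl chamber
# Scalar symmetry check: ||v(P,Q)|| equals ||v(Q,P)|| for the l2 norm.
assert np.isclose(np.linalg.norm(v), np.linalg.norm(vvd_generalized(Q, P)))
```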
7. Applications and Visualization
The VVD supports a spectrum of applications in geometric representation learning. In tasks such as knowledge graph completion, item recommendation, and question answering, models utilizing SPDₙ and its VVD-based geometry demonstrate improved performance relative to analogous Euclidean and hyperbolic architectures. The vector-valued structure of the VVD also enables direct visualization of learned embeddings, revealing clear separation between positive and negative samples.
A plausible implication is enhanced interpretability: the decomposition of distances along Weyl chamber directions provides insights into how complex data relationships unfold in high-rank, non-Euclidean spaces.
For a rigorous treatment of the geometric and algebraic properties, as well as applications and proof details, see the full development and appendices in "Vector-valued Distance and Gyrocalculus on the Space of Symmetric Positive Definite Matrices" (López et al., 2021).