Mutual Transport Dependence: Geometric Inference

Updated 17 October 2025
  • Mutual Transport Dependence (MTD) is a geometric measure quantifying statistical dependence via optimal transport discrepancies between joint and marginal distributions.
  • MTD enables design optimization in simulation-based and implicit models by incorporating application-relevant cost functions that capture specific error metrics.
  • MTD's flexibility and bounded behavior in limiting cases make it ideal for high-dimensional and deterministic contexts, improving estimation accuracy.

Mutual Transport Dependence (MTD) is a geometric measure of statistical dependence between an unknown parameter and the outcome of an experiment, formulated within the framework of optimal transport theory. Unlike conventional information-theoretic measures, such as mutual information, which quantify dependence based purely on probability densities, MTD directly incorporates the geometry of the underlying spaces through sample-level cost functions. This enables experimenters and analysts to tailor their design criteria to specific estimation goals or error metrics. MTD is particularly relevant in simulation-based or implicit models, where density-based methods are limited, and provides a flexible, bounded objective for optimizing experimental designs.

1. Formal Definition and Motivation

MTD is defined as the optimal transport discrepancy between the joint probability distribution of parameter and outcome and the product of their marginals. For a design variable $d$, the MTD is given by

$$\text{MTD}(d) = \mathrm{OT}_c\left[p(\theta, y \mid d),\ p(\theta)\,p(y \mid d)\right]$$

where $\mathrm{OT}_c$ denotes the optimal transport cost induced by a cost function $c$ acting over pairs $(\theta, y), (\theta', y')$ from the sample spaces. The foundational motivation for MTD lies in overcoming the restrictive invariance properties of mutual information (MI). MI is invariant under all injective transformations of $\theta$ and $y$, which is suitable for abstract information reduction but suboptimal when error is measured in a specific geometric metric (e.g., the Euclidean norm). MTD's design allows dependence to be measured in accordance with downstream estimation objectives.
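The definition above can be approximated from samples alone. The following is a minimal plug-in sketch (an illustration, not code from the source): assuming a quadratic cost and equal-size empirical measures, the optimal coupling between joint samples and product-of-marginals samples reduces to an assignment problem.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def mtd_estimate(theta, y, rng):
    """Plug-in MTD estimate from joint samples (theta_i, y_i).

    Samples from the product p(theta)p(y|d) are approximated by pairing
    theta with an independently permuted copy of y; the empirical OT
    problem between two equal-size point clouds is then an assignment.
    """
    n = len(theta)
    y_perm = y[rng.permutation(n)]  # product-of-marginals samples
    # pairwise quadratic cost between joint samples and product samples
    cost = (theta[:, None] - theta[None, :]) ** 2 + (y[:, None] - y_perm[None, :]) ** 2
    rows, cols = linear_sum_assignment(cost)  # optimal coupling (a permutation)
    return cost[rows, cols].mean()

rng = np.random.default_rng(0)
theta = rng.normal(size=400)
y_dependent = theta + 0.1 * rng.normal(size=400)  # tightly coupled outcome
y_independent = rng.normal(size=400)              # outcome ignores theta
# stronger dependence yields a larger transport discrepancy
print(mtd_estimate(theta, y_dependent, rng), mtd_estimate(theta, y_independent, rng))
```

For independent data the joint already equals the product of marginals, so the estimate is near zero; dependence pushes it up.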

2. Geometric Framework for Experimental Design

Standard optimal experimental design (OED) techniques, such as mutual information maximization, implicitly operate in the space of densities and are thus agnostic to the geometry of the sample space. In contrast, the geometric framework advanced by MTD directly exploits the values taken by data and parameters. The coupling created by design $d$ is assessed by the "distance" that must be bridged to transform the joint distribution into the product of its marginals, measured in application-relevant metrics. The choice of cost function $c$ is completely flexible: for instance, Euclidean error can be encoded as $c(\theta, y, \theta', y') = \|\theta-\theta'\|^2 + \|y-y'\|^2$, aligning the objective with mean-squared-error estimation, or a customized geometric metric can be chosen to suit application-specific needs.
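One simple way to encode application-specific geometry is to weight the quadratic cost per block, so that parameter error and outcome error contribute unequally. This sketch is an assumption for illustration, not the paper's exact construction:

```python
import numpy as np

def weighted_quadratic_cost(theta, y, theta_p, y_p, w_theta=1.0, w_y=1.0):
    """Quadratic transport cost with adjustable block weights.

    Setting w_y = 0 focuses the discrepancy on parameter error alone;
    setting w_theta = 0 focuses it on outcome error.
    """
    return w_theta * np.sum((theta - theta_p) ** 2) + w_y * np.sum((y - y_p) ** 2)

# a parameter-only cost ignores outcome mismatch entirely
print(weighted_quadratic_cost(np.zeros(2), np.ones(3), np.zeros(2), np.zeros(3), w_y=0.0))  # 0.0
```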

3. Role of Optimal Transport Theory

Optimal transport theory underpins the MTD metric by defining the minimal "work" required to reconfigure one probability measure into another, subject to a given cost function. In the MTD context, the cost function $c(\theta, y, \theta', y')$ quantifies the effort to rearrange mass between two pairs, permitting an explicit connection with geometric features of the sample spaces. The optimal transport plan $\gamma^*$, which minimizes the expected cost over all admissible couplings, encodes not just dependence but the specific structure of parameter–outcome relationships crucial for downstream tasks. This mechanism circumvents density estimation challenges and is particularly suited to contexts with implicit likelihoods or high-dimensional sample spaces.
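The "minimal work" interpretation can be checked directly on discrete measures (an illustrative sketch using scipy's assignment solver, not code from the source): the optimal plan's cost lower-bounds that of every other one-to-one coupling between the same point clouds.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)
source = rng.normal(size=(8, 2))  # empirical measure to be transported
target = rng.normal(size=(8, 2))  # empirical measure to reach
# squared-Euclidean ground cost between every source/target pair
cost = ((source[:, None, :] - target[None, :, :]) ** 2).sum(axis=-1)

rows, cols = linear_sum_assignment(cost)
optimal = cost[rows, cols].mean()  # cost of the optimal transport plan

# any alternative coupling (here: random permutations) does at least as much work
worst = max(cost[np.arange(8), rng.permutation(8)].mean() for _ in range(200))
print(optimal <= worst)  # True
```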

4. Mathematical Formulation

The foundational formula for MTD is

$$\text{MTD}(d) = \min_{\gamma \in \Pi\left(p(\theta, y \mid d),\ p(\theta)\,p(y \mid d)\right)} \mathbb{E}_{\gamma}\left[c(\theta, y, \theta', y')\right]$$

where $\Pi(\cdot,\cdot)$ is the set of all couplings with the prescribed marginals. For quadratic cost,

$$c(\theta, y, \theta', y') = \|\theta - \theta'\|^2 + \|y - y'\|^2.$$

MTD admits extensions: variants such as "target transport dependence" (TTD), which focuses solely on the parameter space, and "expected data transport dependence" (DTD), restricted to outcome space, are formalized by weighting the cost appropriately. The relationship between MTD and mutual information is elucidated by an inequality (under log-concavity assumptions):

$$\lambda \cdot \text{MTD}(d) \leq 2\, I(d)$$

for an appropriate constant $\lambda$, indicating that MTD is upper-bounded by a constant multiple of MI but does not diverge as MI does in near-deterministic regimes.

5. Applications and Comparative Performance

MTD provides several practical advantages for experimental design:

  • Density-free Optimization: It bypasses explicit density evaluations, enabling its use in simulation-based inference.
  • Incorporation of Geometry: The cost function allows designs to be optimized for specific error metrics important in downstream estimation, such as RMSE in Euclidean coordinates or specialized transformed domains.
  • Stable Behavior in Limiting Cases: MTD remains bounded even as experiments approach determinism, unlike MI which can become infinite.
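The bounded-limit behavior can be illustrated numerically. In the toy sketch below (an assumed linear-Gaussian model, not an experiment from the source), $y = \theta + \sigma\varepsilon$ with standard Gaussian terms, so MI has the closed form $-\tfrac{1}{2}\log(1-\rho^2)$ and grows without bound as $\sigma \to 0$, while a plug-in MTD estimate under quadratic cost stays finite:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def mtd_quadratic(theta, y, rng):
    """Assignment-based plug-in MTD estimate under quadratic cost."""
    y_perm = y[rng.permutation(len(y))]
    cost = (theta[:, None] - theta[None, :]) ** 2 + (y[:, None] - y_perm[None, :]) ** 2
    r, c = linear_sum_assignment(cost)
    return cost[r, c].mean()

rng = np.random.default_rng(2)
theta = rng.normal(size=300)
mis, mtds = [], []
for sigma in (1.0, 0.1, 1e-3, 1e-6):          # experiment becomes near-deterministic
    y = theta + sigma * rng.normal(size=300)
    rho_sq = 1.0 / (1.0 + sigma**2)           # corr(theta, y)^2 in this model
    mi = -0.5 * np.log(1.0 - rho_sq)          # Gaussian MI: diverges as sigma -> 0
    mtd = mtd_quadratic(theta, y, rng)        # remains bounded
    mis.append(mi)
    mtds.append(mtd)
    print(f"sigma={sigma:g}  MI={mi:6.2f}  MTD~={mtd:5.2f}")
```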

Case studies in the paper demonstrate superior efficiency and flexibility:

  • Source Finding Problem: MTD-based sequential designs break the symmetry imposed by MI-optimized experiments, yielding unimodal posteriors and lower RMSEs for location estimates.
  • CES Model (Behavioral Economics): By tuning the cost function in transformed coordinates to emphasize elasticity estimates, MTD-driven experimental designs yield lower RMSE on targeted model parameters compared to MI-driven methods.

6. Algorithmic and Computational Considerations

MTD is amenable to gradient-based, sample-efficient optimization, leveraging sample-based plug-in estimators for both the joint and product-of-marginals distributions. This is especially beneficial when evaluating the likelihood is computationally expensive or the model is implicit. Closed-form solutions can be derived for simple models (e.g., linear-Gaussian); in general, sample-based approximations and stochastic optimization are employed.
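For gradient-based use, the exact assignment can be replaced by an entropic-regularized surrogate. The sketch below is a generic Sinkhorn iteration, a standard technique assumed here rather than taken from the source, which yields a smooth, differentiable transport cost suitable for stochastic optimization of the design:

```python
import numpy as np

def sinkhorn_ot_cost(cost, eps=0.05, iters=300):
    """Entropic-regularized OT cost between two uniform discrete measures.

    Sinkhorn iterations produce a dense, differentiable transport plan,
    a smooth surrogate for the exact (assignment-based) MTD objective.
    """
    n, m = cost.shape
    a, b = np.full(n, 1.0 / n), np.full(m, 1.0 / m)  # uniform marginals
    K = np.exp(-cost / eps)                          # Gibbs kernel
    v = np.ones(m)
    for _ in range(iters):
        u = a / (K @ v)       # scale rows to match marginal a
        v = b / (K.T @ u)     # scale columns to match marginal b
    plan = u[:, None] * K * v[None, :]               # feasible coupling
    return float((plan * cost).sum())

rng = np.random.default_rng(3)
cost = rng.random((16, 16))  # any nonnegative ground-cost matrix
print(sinkhorn_ot_cost(cost))
```

Since the independent coupling is always feasible, the regularized transport cost never exceeds the mean of the cost matrix under uniform marginals; smaller `eps` tightens the surrogate toward the exact OT cost at the price of slower convergence.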

7. Broader Implications and Extensions

The introduction of MTD marks a transition towards geometry-aware objectives in statistical inference and experimental design. Its flexibility in encoding domain-specific metrics empowers practitioners to construct experiments that are tightly aligned with the ultimate goal—minimizing estimation uncertainty in the precise metric relevant to their scientific or engineering task. The boundedness and adaptability of MTD suggest its applicability in stochastic, high-dimensional, or simulation-based contexts where classical density-centric methods struggle.

In summary, Mutual Transport Dependence is a rigorous, flexible metric rooted in optimal transport theory that generalizes classical dependence measures for experimental design. It offers a principled route for optimizing design strategies with direct alignment to practical performance goals, thereby contributing substantially to the toolbox of modern Bayesian and simulation-based inference.
