
Beyond DAGs: A Latent Partial Causal Model for Multimodal Learning

Published 9 Feb 2024 in cs.LG, cs.CV, and stat.ML | arXiv:2402.06223v2

Abstract: Directed acyclic graphs (DAGs) are fundamental graph structures in causal modeling, but identifying the desired DAG from observational data often requires strong assumptions that may not hold in real-world scenarios, especially for latent causal models and complex multimodal data. This raises the question of whether we can relax or bypass the DAG assumption while maintaining practical utility. In this work, we propose a novel latent partial causal model for multimodal data, featuring two latent coupled variables, connected by an undirected edge, to represent the transfer of knowledge across modalities. Under specific statistical assumptions, we establish an identifiability result, demonstrating that representations learned by multimodal contrastive learning correspond to the latent coupled variables up to a trivial transformation. This result deepens our understanding of why multimodal contrastive learning works, highlights its potential for disentanglement, and expands the utility of pre-trained models like CLIP. Synthetic experiments confirm the robustness of our findings, even when the assumptions are partially violated. Most importantly, experiments show that a pre-trained CLIP model embodies disentangled representations, enabling few-shot learning and improving domain generalization across diverse real-world datasets. Together, these contributions push the boundaries of multimodal contrastive learning, both theoretically and, crucially, in practical applications.
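The abstract does not spell out the training objective, but the identifiability claim concerns representations learned by CLIP-style multimodal contrastive learning. As an illustrative assumption (not necessarily the exact objective analyzed in the paper), a minimal sketch of the standard symmetric InfoNCE loss over paired image/text embeddings looks as follows:

import torch
import torch.nn.functional as F

def symmetric_contrastive_loss(image_emb, text_emb, temperature=0.07):
    """Symmetric InfoNCE loss over a batch of paired image/text embeddings,
    as used in CLIP-style multimodal contrastive learning (illustrative sketch)."""
    # Project embeddings onto the unit hypersphere.
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)

    # Pairwise cosine similarities, scaled by the temperature.
    logits = image_emb @ text_emb.t() / temperature

    # Matching image/text pairs lie on the diagonal of the similarity matrix.
    targets = torch.arange(logits.size(0), device=logits.device)

    # Average the image-to-text and text-to-image cross-entropy terms.
    loss_i2t = F.cross_entropy(logits, targets)
    loss_t2i = F.cross_entropy(logits.t(), targets)
    return 0.5 * (loss_i2t + loss_t2i)

# Usage with random embeddings standing in for encoder outputs:
if __name__ == "__main__":
    imgs = torch.randn(8, 512)
    txts = torch.randn(8, 512)
    print(symmetric_contrastive_loss(imgs, txts))

Under the paper's statistical assumptions, minimizing an objective of this form is what recovers the latent coupled variables up to a trivial transformation.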
