Selectivity in Probabilistic Causality: Drawing Arrows from Inputs to Stochastic Outputs (1108.3074v2)

Published 15 Aug 2011 in cs.AI, math.PR, physics.data-an, and q-bio.QM

Abstract: Given a set of several inputs into a system (e.g., independent variables characterizing stimuli) and a set of several stochastically non-independent outputs (e.g., random variables describing different aspects of responses), how can one determine, for each of the outputs, which of the inputs it is influenced by? The problem has applications ranging from modeling pairwise comparisons to reconstructing mental processing architectures to conjoint testing. A necessary and sufficient condition for a given pattern of selective influences is provided by the Joint Distribution Criterion, according to which the problem of "what influences what" is equivalent to that of the existence of a joint distribution for a certain set of random variables. For inputs and outputs with finite sets of values this criterion translates into a test of consistency of a certain system of linear equations and inequalities (Linear Feasibility Test) which can be performed by means of linear programming. The Joint Distribution Criterion also leads to a metatheoretical principle for generating a broad class of necessary conditions (tests) for diagrams of selective influences. Among them is the class of distance-type tests based on the observation that certain functionals on jointly distributed random variables satisfy the triangle inequality.
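
To make the Linear Feasibility Test concrete, consider the simplest setting: two binary inputs α and β, two binary outputs A and B, and the hypothesized diagram {A ← α, B ← β}. The sketch below is a minimal illustration under that assumption; the data layout (a dict mapping each treatment (a, b) to its observed 2×2 joint distribution of (A, B)) and all function and variable names are illustrative choices, not notation from the paper. It checks the Joint Distribution Criterion by asking SciPy's linear-programming solver whether a joint distribution over the hypothetical variables A_0, A_1, B_0, B_1 (A under each value of α, B under each value of β) exists whose pairwise marginals reproduce every observed treatment distribution.

```python
# A minimal sketch of the Linear Feasibility Test for the diagram
# {A <- alpha, B <- beta} with two binary inputs and two binary outputs.
# The data format and helper names are illustrative assumptions.

import itertools
import numpy as np
from scipy.optimize import linprog

def linear_feasibility_test(observed):
    """observed[(a, b)][i, j] = P(A=i, B=j | alpha=a, beta=b),
    with a, b, i, j all in {0, 1}."""
    # Hidden variables of the Joint Distribution Criterion:
    # A0, A1 (A under alpha=0,1) and B0, B1 (B under beta=0,1).
    # One unknown probability per joint value (a0, a1, b0, b1): 16 atoms.
    atoms = list(itertools.product([0, 1], repeat=4))
    A_eq, b_eq = [], []
    for (a, b), p in observed.items():
        for i, j in itertools.product([0, 1], repeat=2):
            # P(A_a = i, B_b = j) must equal the observed probability.
            row = [1.0 if (atom[a] == i and atom[2 + b] == j) else 0.0
                   for atom in atoms]
            A_eq.append(row)
            b_eq.append(p[i, j])
    # Normalization (redundant when each observed table sums to 1).
    A_eq.append([1.0] * len(atoms))
    b_eq.append(1.0)
    # Pure feasibility question: zero objective, probabilities in [0, 1].
    res = linprog(c=np.zeros(len(atoms)), A_eq=np.array(A_eq),
                  b_eq=np.array(b_eq), bounds=[(0, 1)] * len(atoms),
                  method="highs")
    return res.success  # True iff the system is consistent

# Example: outputs that deterministically copy "their" inputs pass the test.
obs = {(a, b): np.array([[float(i == a and j == b) for j in (0, 1)]
                         for i in (0, 1)])
       for a in (0, 1) for b in (0, 1)}
print(linear_feasibility_test(obs))  # True
```

A True result means the observed distributions are consistent with the hypothesized selective influences (the diagram is not rejected); infeasibility of the linear system would falsify the diagram. The same construction scales to more inputs, outputs, and value sets by enlarging the atom space and the marginal constraints accordingly.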
