
Is Bayes Posterior just Quick and Dirty Confidence? (1112.5582v1)

Published 23 Dec 2011 in stat.ME

Abstract: Bayes [Philos. Trans. R. Soc. Lond. 53 (1763) 370--418; 54 296--325] introduced the observed likelihood function to statistical inference and provided a weight function to calibrate the parameter; he also introduced a confidence distribution on the parameter space but did not provide present justifications. Of course the names likelihood and confidence did not appear until much later: Fisher [Philos. Trans. R. Soc. Lond. Ser. A Math. Phys. Eng. Sci. 222 (1922) 309--368] for likelihood and Neyman [Philos. Trans. R. Soc. Lond. Ser. A Math. Phys. Eng. Sci. 237 (1937) 333--380] for confidence. Lindley [J. Roy. Statist. Soc. Ser. B 20 (1958) 102--107] showed that the Bayes and the confidence results were different when the model was not location. This paper examines the occurrence of true statements from the Bayes approach and from the confidence approach, and shows that the proportion of true statements in the Bayes case depends critically on the presence of linearity in the model; and with departure from this linearity the Bayes approach can be a poor approximation and be seriously misleading. Bayesian integration of weighted likelihood thus provides a first-order linear approximation to confidence, but without linearity can give substantially incorrect results.

Citations (130)

Summary

Overview of "Is Bayes Posterior just Quick and Dirty Confidence?"

The paper by D. A. S. Fraser titled "Is Bayes Posterior just Quick and Dirty Confidence?" critically examines the Bayesian approach to statistical inference, particularly its use of the conditional probability formula to derive posterior distributions. The central theme is the comparison between the Bayes posterior and the confidence distribution, a concept rooted in Fisher's fiducial argument and formalized by Neyman. The discussion is motivated by the observation that Bayesian methods can yield misleading results when the parameter does not enter the model linearly.
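To make the contrast concrete, the two constructions can be written side by side in standard notation (a sketch for this summary, not the paper's own notation): the Bayes posterior reweights the observed likelihood by a prior, while the confidence distribution is read directly from the observed distribution (p-value) function; for a pure location model with a flat prior the two coincide.

```latex
% Bayes: prior times observed likelihood, renormalized (conditional probability formula)
\pi(\theta \mid y^{0}) \;=\; \frac{\pi(\theta)\,L(\theta; y^{0})}{\int \pi(t)\,L(t; y^{0})\,dt}

% Confidence: one minus the observed p-value function of a scalar model
C(\theta; y^{0}) \;=\; 1 - \Pr\{\, y \le y^{0} \,;\, \theta \,\}

% Location model y = \theta + z, with z \sim f and distribution function F_{0},
% under the flat prior \pi(\theta) \propto 1: the posterior distribution function is
\int_{-\infty}^{\theta} f(y^{0} - t)\,dt \;=\; 1 - F_{0}(y^{0} - \theta) \;=\; C(\theta; y^{0})
% so the Bayes and confidence distributions agree exactly in the linear (location) case.
```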

Key Points and Arguments

  1. Bayesian and Confidence Approaches: The paper highlights that the Bayesian approach leverages the conditional probability formula to transform a prior distribution along with observed data into a posterior distribution. This contrasts with the frequentist confidence approach, which utilizes the distribution function of the observed data to make inferences.
  2. Linearity and Model Assumptions: Bayesian and frequentist confidence results are fundamentally aligned only when the model under consideration is a location model, in which the parameter enters linearly (as in the sketch above). In that case the Bayesian posterior distribution coincides with the confidence distribution.
  3. Critique of Default Priors: The use of default or non-informative priors in Bayesian analysis, while popular for their mathematical simplicity, is criticized for not providing reliable probabilistic statements about parameters unless the structural assumptions such as linearity hold.
  4. Evaluating Posterior Distributions: Using Neyman diagrams and the actual proportion of true statements attached to nominal posterior quantiles, the paper shows that Bayesian methods can deviate markedly from their claimed performance when applied to nonlinear models (a small simulation illustrating this kind of coverage check follows this list).
  5. Significant Examples: Fraser presents several examples showing how departures from linearity—through bounded parameter models, parameter curvature, and model curvature—can result in Bayesian posteriors that do not align with frequentist confidence properties or are simply not reliable.
  6. Implications and Conclusions: The paper argues that Bayesian analysis using arbitrary priors might misrepresent probability claims and should be cautiously interpreted, particularly in contexts where model assumptions are violated. In practice, combining Bayesian posteriors with external information or subjective priors should be critically evaluated against confidence properties.
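The following simulation, written for this summary rather than taken from the paper, illustrates the coverage check described in points 4 and 5. It uses a hypothetical bounded-parameter setup, y ~ N(θ, 1) with θ ≥ 0 and a flat prior on [0, ∞), and reports how often a nominal 95% posterior upper bound actually covers the true value.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)

def bayes_upper_bound(y, level=0.95):
    """Posterior upper bound for theta in the model y ~ N(theta, 1), theta >= 0,
    under a flat prior on [0, inf).  The posterior is a normal density centred
    at y, truncated to the nonnegative half-line."""
    def posterior_cdf(theta):
        return (stats.norm.cdf(theta - y) - stats.norm.cdf(-y)) / stats.norm.cdf(y)
    # The posterior CDF rises from 0 at theta = 0 towards 1, so this bracket works.
    return optimize.brentq(lambda t: posterior_cdf(t) - level, 0.0, y + 10.0)

def empirical_coverage(theta_true, level=0.95, n_rep=5000):
    """Proportion of repetitions in which the posterior bound covers theta_true."""
    ys = rng.normal(theta_true, 1.0, size=n_rep)
    return np.mean([theta_true <= bayes_upper_bound(y, level) for y in ys])

# Nominal 95% posterior statements: how often are they actually true?
for theta in [0.0, 0.5, 2.0, 5.0]:
    print(f"true theta = {theta:3.1f}  realized coverage = {empirical_coverage(theta):.3f}")
```

In this hypothetical setup the realized proportion of true statements matches the nominal 95% only when θ is well away from the bound; near the boundary the posterior statements no longer carry their advertised frequency, which is precisely the Neyman-diagram style of evaluation the paper directs at nonlinear models.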

Implications for Future Research

The implications for future work in statistical inference are substantial. Researchers are encouraged to examine the conditions under which Bayesian claims are valid, in particular the role of model structure and parameter linearity. Since the paper highlights the limitations of default priors, further exploration of domain-specific informative priors, or of hybrid methods that reconcile Bayesian and frequentist principles, may offer improved interpretability and robustness.

Concluding Remarks

In conclusion, Fraser's paper serves as a cautionary note against the uncritical application of Bayesian methods when the assumptions required for valid probabilistic inference are not met. It underscores the need for rigorous examination of the foundational assumptions of statistical inference and advocates a responsible, informed use of Bayesian statistics in practice.
