Consistency of sample-size–dependent priors under the strong likelihood principle

Determine whether a subjective Bayesian analysis that respects the strong likelihood principle can coherently justify prior distributions that vary with the sample size n, as arises in ultra-high-dimensional settings where model complexity grows with n. Specifically, either establish conditions or frameworks under which changing prior uncertainty in response to increased data volume does not violate the strong likelihood principle, or show that no such justification is possible.

Background

The paper highlights the strong likelihood principle, which implies that Bayesian procedures should depend only on the prior and on the likelihood of the data actually observed, not on unobserved data or on design details such as the planned sample size. From this, the authors argue that the prior should not depend on the sample size.
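
As a minimal formal sketch of this implication (the notation below is illustrative, not drawn from the paper): the strong likelihood principle states that if two observations x and y induce proportional likelihood functions, L(θ; x) ∝ L(θ; y) as functions of θ, then the inferences drawn from them must coincide. Bayesian updating with a fixed prior satisfies this automatically, since

\[ \pi(\theta \mid x) \;=\; \frac{\pi(\theta)\, L(\theta; x)}{\int \pi(\theta')\, L(\theta'; x)\, d\theta'} \]

depends on the data only through the likelihood function (and is invariant to rescaling it), so the planned sample size, the stopping rule, and other design details enter nowhere.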

However, in contemporary ultra-high-dimensional problems, researchers often fit progressively more complex models as data volume increases, implicitly tying model complexity to n. The authors question how a subjective Bayesian can consistently change their prior uncertainty merely because more data were collected, while still adhering to the strong likelihood principle.
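
The tension can be made explicit with an illustrative formalization (again, notation not taken from the paper). If the prior is allowed to depend on n, the posterior becomes

\[ \pi_n(\theta \mid x) \;\propto\; \pi_n(\theta)\, L(\theta; x), \]

so even with the observed likelihood held fixed, the inference changes with n through the prior alone, i.e., through a channel other than the likelihood. A standard setting where this arises is sparse high-dimensional regression with p_n ≫ n candidate coefficients, where prior sparsity levels are routinely tuned to n.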

References

It is not clear how it can be argued consistently that since you gave me one million observations and not just the thousand I hoped to get, I changed my uncertainties about the model parameters.

Ritov (3 Aug 2025), From Thomas Bayes to Big Data: On the feasibility of being a subjective Bayesian, arXiv:2508.01642, Section 4.2, “The ‘prior’ should depend on the sample size n.”