Bayesian sample size determination using robust commensurate priors with interpretable discrepancy weights (2401.10592v2)

Published 19 Jan 2024 in stat.ME and stat.AP

Abstract: Randomized controlled clinical trials provide the gold standard for evidence generation in relation to the efficacy of a new treatment in medical research. Relevant information from previous studies may be desirable to incorporate in the design and analysis of a new trial, with the Bayesian paradigm providing a coherent framework to formally incorporate prior knowledge. Many established methods involve the use of a discounting factor, sometimes related to a measure of 'similarity' between historical and the new trials. However, it is often the case that the sample size is highly nonlinear in those discounting factors. This hinders communication with subject-matter experts to elicit sensible values for borrowing strength at the trial design stage. Focusing on a commensurate predictive prior method that can incorporate historical data from multiple sources, we highlight a particular issue of nonmonotonicity and explain why this causes issues with interpretability of the discounting factors (hereafter referred to as 'weights'). We propose a solution for this, from which an analytical sample size formula is derived. We then propose a linearization technique such that the sample size changes uniformly over the weights. Our approach leads to interpretable weights that represent the probability that historical data are (ir)relevant to the new trial, and could therefore facilitate easier elicitation of expert opinion on their values.

Keywords: Bayesian sample size determination; Commensurate priors; Historical borrowing; Prior aggregation; Uniform shrinkage.
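The abstract describes borrowing historical information through a commensurate prior governed by a discounting weight, and notes that the required sample size is typically a nonlinear function of that weight. The sketch below is a minimal toy illustration of that point only, not the paper's method: it assumes a single historical study, normally distributed outcomes with known unit variance, a conjugate normal prior, and a hypothetical mapping from a weight w to the commensurate precision (tau = w / (1 - w) * tau_max). The helper `required_n` and all numerical values are invented for illustration.

```python
import numpy as np

# Toy sketch (not the paper's formulation): one historical study,
# normal outcomes with known sigma, commensurate prior on the new-trial mean.
#
#   Historical estimate:  theta_hist ~ N(mu0, s0^2)
#   Commensurate prior:   theta_new | theta_hist ~ N(theta_hist, 1/tau)
#   Marginal prior:       theta_new ~ N(mu0, s0^2 + 1/tau)
#
# A hypothetical mapping from a weight w in (0, 1) to the commensurate
# precision: tau = w / (1 - w) * tau_max, so w -> 1 means full borrowing
# and w -> 0 means no borrowing.

def required_n(w, sigma=1.0, s0=0.1, tau_max=400.0, target_sd=0.05):
    """Smallest new-trial sample size giving posterior sd of theta_new <= target_sd."""
    w = np.clip(w, 1e-9, 1 - 1e-9)
    prior_var = s0**2 + (1.0 - w) / (w * tau_max)   # marginal prior variance
    target_prec = 1.0 / target_sd**2                 # required posterior precision
    n = sigma**2 * max(target_prec - 1.0 / prior_var, 0.0)
    return int(np.ceil(n))

for w in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(f"w = {w:.2f}  ->  required n = {required_n(w)}")
```

In this toy setup the required sample size falls steeply for small w and flattens as w approaches 1, so equal increments in the weight do not correspond to equal changes in n. That nonlinearity is the elicitation difficulty motivating the paper, whose proposal is a reparameterization under which the sample size changes uniformly over the weights and the weights can be read as the probability that historical data are (ir)relevant to the new trial.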
