Evaluating and Utilizing Surrogate Outcomes in Covariate-Adjusted Response-Adaptive Designs (2408.02667v3)
Abstract: Surrogate outcomes have long been studied as substitutes for long-term primary outcomes. However, current surrogate evaluation methods do not directly account for their benefits in updating treatment randomization probabilities in adaptive experiments that aim to learn and respond to treatment effect heterogeneity. In this context, surrogate outcomes can expedite updates to randomization probabilities and thus improve expected outcomes of newly enrolled participants by enabling earlier detection of heterogeneous treatment effects. We introduce a novel approach for evaluating candidate surrogate outcomes that quantifies both of these benefits in sequential adaptive experiments. We also propose a new Covariate-Adjusted Response-Adaptive design that uses an Online SuperLearner to evaluate and adaptively select surrogate outcomes for updating treatment randomization probabilities during the trial. We further introduce a Targeted Maximum Likelihood Estimation method that addresses dependence in adaptively collected data and achieves asymptotic normality without parametric assumptions. Our design and estimation methods show robust performance in simulations, including those using real trial data. Overall, this framework not only provides a comprehensive way to quantify benefits and select among candidate surrogate outcomes, but also offers a general tool for evaluating various adaptive designs with valid inference, providing insight into the opportunities and costs of alternative designs that could have been implemented.
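
To make the adaptive mechanism concrete, the following is a minimal illustrative sketch of a covariate-adjusted response-adaptive loop in which an early-observed surrogate outcome drives updates to treatment randomization probabilities. It is not the paper's Online SuperLearner or TMLE procedure; all function names, the simulated data-generating process, and the probability truncation bounds are hypothetical choices for illustration only.

```python
# Toy sketch (assumptions, not the paper's algorithm): randomization probabilities
# for each new enrollment wave are updated from a working model fit to the
# surrogate outcome, which is available sooner than the primary outcome.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def enroll(n):
    """Simulate baseline covariates for a new wave of participants."""
    return rng.normal(size=(n, 2))

def surrogate_outcome(x, a):
    """Toy surrogate: treatment helps only when x[:, 0] > 0 (heterogeneous effect)."""
    effect = 0.8 * a * (x[:, 0] > 0)
    return (rng.random(len(a)) < 0.3 + 0.4 * effect).astype(int)

def assignment_probability(model, x, lo=0.1, hi=0.9):
    """Map the estimated conditional treatment effect (from surrogate data) to a
    treatment probability, truncated away from 0 and 1 to preserve exploration."""
    cate = (model.predict_proba(np.c_[x, np.ones(len(x))])[:, 1]
            - model.predict_proba(np.c_[x, np.zeros(len(x))])[:, 1])
    return np.clip(0.5 + cate, lo, hi)

X_hist, A_hist, S_hist = [], [], []
model = None
for wave in range(10):
    x = enroll(50)
    if model is None:
        p = np.full(len(x), 0.5)              # burn-in: balanced randomization
    else:
        p = assignment_probability(model, x)  # CARA update based on surrogate data
    a = (rng.random(len(x)) < p).astype(int)
    s = surrogate_outcome(x, a)               # surrogate observed before the primary outcome
    X_hist.append(x); A_hist.append(a); S_hist.append(s)
    # Refit the working surrogate-outcome model on all adaptively collected data so far.
    Xa = np.c_[np.vstack(X_hist), np.concatenate(A_hist)]
    model = LogisticRegression().fit(Xa, np.concatenate(S_hist))
    print(f"wave {wave}: mean P(A=1) = {p.mean():.2f}")
```

In this sketch, participants whose covariates predict benefit are steered toward treatment in later waves; the paper's contribution is to quantify how much such surrogate-driven updating improves participant outcomes and to provide estimation that remains valid under the resulting dependence in the collected data.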