On Dealing with Censored Largest Observations under Weighted Least Squares (1312.2533v2)

Published 9 Dec 2013 in stat.ME

Abstract: When observations are subject to right censoring, weighted least squares with appropriate weights (to adjust for censoring) is sometimes used for parameter estimation. With Stute's weighted least squares method, when the largest observation is censored ($Y_{(n)}+$), it is natural to apply the redistribution-to-the-right algorithm of Efron (1967). However, Efron's redistribution algorithm can lead to bias and inefficiency in estimation. This study explains these issues clearly and proposes alternative ways of treating $Y_{(n)}+$. The first four proposed approaches are based on the well-known Buckley--James (1979) method of imputation with Efron's tail correction, and the last approach is indirectly based on a general mean imputation technique from the literature. All the new schemes use penalized weighted least squares optimized by quadratic programming within accelerated failure time models. Furthermore, two additional novel imputation approaches are proposed to impute tied censored observations in the tail, which are often found in survival analysis with heavy censoring. Several simulation studies and a real data analysis demonstrate that the proposed approaches generally outperform Efron's redistribution approach and lead to considerably smaller mean squared error and bias estimates.
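
As background for the abstract, the sketch below shows how Kaplan-Meier jump weights of the kind used in Stute's weighted least squares can be computed, and how Efron's redistribution-to-the-right handles a censored largest observation by relabelling $Y_{(n)}+$ as an event so the weights sum to one. This is a minimal illustrative sketch in NumPy, not code from the paper; the function name `stute_km_weights` and the `efron_tail` flag are assumptions, and the paper's penalized quadratic-programming schemes and its proposed tail imputations are not reproduced here.

```python
import numpy as np

def stute_km_weights(time, delta, efron_tail=True):
    """Kaplan-Meier jump weights used in Stute-type weighted least squares.

    time  : observed times (e.g. log-times for an AFT model)
    delta : censoring indicators (1 = event, 0 = right-censored)

    If efron_tail is True and the largest observation is censored,
    Efron's (1967) redistribution-to-the-right is applied by treating
    that observation as an event, so the weights sum to one.
    """
    time = np.asarray(time, dtype=float)
    delta = np.asarray(delta, dtype=int)
    n = len(time)

    order = np.argsort(time, kind="stable")  # ties kept in input order
    d = delta[order]

    if efron_tail and d[-1] == 0:
        d[-1] = 1  # relabel Y_(n)+ as uncensored

    w = np.zeros(n)
    surv = 1.0  # Kaplan-Meier survival just before the i-th ordered time
    for i in range(n):
        at_risk = n - i
        w[i] = surv * d[i] / at_risk          # KM jump at this observation
        surv *= ((at_risk - 1) / at_risk) ** d[i]

    # return weights in the original observation order
    weights = np.zeros(n)
    weights[order] = w
    return weights


# Hypothetical usage: weighted least squares for a log-linear AFT model,
# minimizing sum_i w_i * (log T_i - x_i' beta)^2 via square-root weighting.
# X is an (n, p) design matrix including an intercept column; y = log(time).
#   w = stute_km_weights(time, delta)
#   sw = np.sqrt(w)
#   beta_hat, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
```

Without the tail correction (or an alternative treatment of $Y_{(n)}+$), a censored largest observation receives zero weight and the remaining weights sum to less than one, which is the source of the bias and inefficiency the paper addresses.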
