Semiparametric inference for the scale-mixture of normal partial linear regression model with censored data (2011.07559v1)

Published 15 Nov 2020 in stat.ME, math.ST, stat.CO, stat.OT, and stat.TH

Abstract: In the framework of censored data modeling, the classical linear regression model that assumes normally distributed random errors has received increasing attention in recent years, mainly for mathematical and computational convenience. However, practical studies have often criticized this model for its sensitivity to departures from normality and to partial nonlinearity. This paper proposes to address both issues simultaneously in the context of the partial linear regression model by assuming that the random errors follow a scale-mixture of normal (SMN) family of distributions. The proposed method allows data to be modeled with great flexibility, accommodating heavy tails and outliers. By using a B-spline approximation for the nonlinear component and the convenient hierarchical representation of the SMN distributions, an analytically tractable EM-type algorithm is developed to perform maximum likelihood inference on the model parameters. Various simulation studies are conducted to investigate the finite-sample properties of the estimators as well as the robustness of the model to heavy-tailed datasets. Real-world data examples are then analyzed to illustrate the usefulness of the proposed methodology.
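
The sketch below illustrates the core idea described in the abstract in a deliberately simplified setting: a partial linear model with Student-t errors (one member of the SMN family), no censoring, and a fixed degrees-of-freedom parameter. The function names, the spline settings, and the choice of `nu` are illustrative assumptions, not the paper's implementation, which additionally integrates over censored responses in the E-step and covers other SMN members.

```python
# Minimal EM-type fit of y = X @ beta + f(t) + e with Student-t errors,
# using the hierarchical SMN representation e_i | u_i ~ N(0, sigma2 / u_i),
# u_i ~ Gamma(nu/2, nu/2), and a B-spline basis for f.
import numpy as np
from scipy.interpolate import splev


def bspline_basis(x, knots, degree):
    """Evaluate each B-spline basis function at the points x."""
    n_basis = len(knots) - degree - 1
    B = np.empty((len(x), n_basis))
    for j in range(n_basis):
        c = np.zeros(n_basis)
        c[j] = 1.0
        B[:, j] = splev(x, (knots, c, degree))
    return B


def em_t_partial_linear(y, X, t, nu=4.0, df_spline=8, degree=3, n_iter=100):
    """EM-type estimation for a partial linear model with Student-t errors."""
    # Full knot vector: repeated boundary knots plus interior quantile knots.
    n_interior = df_spline - degree - 1
    qs = np.linspace(0.0, 1.0, n_interior + 2)[1:-1]
    interior = np.quantile(t, qs) if n_interior > 0 else np.array([])
    knots = np.concatenate([[t.min()] * (degree + 1),
                            interior,
                            [t.max()] * (degree + 1)])
    B = bspline_basis(t, knots, degree)
    D = np.hstack([X, B])                     # combined design: linear + spline parts
    coef, *_ = np.linalg.lstsq(D, y, rcond=None)
    sigma2 = np.var(y - D @ coef)

    for _ in range(n_iter):
        resid = y - D @ coef
        # E-step: expected Gamma mixing weights given the current residuals.
        u = (nu + 1.0) / (nu + resid**2 / sigma2)
        # M-step: weighted least squares for the coefficients, then scale update.
        w = np.sqrt(u)
        coef, *_ = np.linalg.lstsq(D * w[:, None], y * w, rcond=None)
        sigma2 = np.sum(u * (y - D @ coef) ** 2) / len(y)

    n_lin = X.shape[1]
    return coef[:n_lin], coef[n_lin:], sigma2  # beta, spline coefficients, scale
```

On simulated data (for example, X of shape (n, p) plus a smooth signal in t), multiplying the returned spline coefficients by the B-spline basis recovers an estimate of the nonparametric component f, while the downweighting of large residuals through u is what gives the SMN approach its robustness to heavy tails and outliers.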
