Statistical Inference Based on a New Weighted Likelihood Approach (1610.07949v5)

Published 25 Oct 2016 in stat.ME

Abstract: We discuss a new weighted likelihood method for parametric estimation. The method is motivated by the need for a simple estimation strategy that provides a robust solution while remaining fully efficient when the model is correctly specified. This is achieved by appropriately weighting the score function at each observation in the maximum likelihood score equation. The weight function measures the compatibility of each observation with the model, relative to the remaining observations, and applies downweighting only when necessary, rather than automatically downweighting a proportion of the observations at all times. This allows the estimators to retain full asymptotic efficiency at the model. We establish the theoretical properties of the proposed estimators and substantiate the theory through simulation and real data examples. Our approach provides an alternative to the weighted likelihood method of Markatou et al. (1997, 1998).
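
To make the abstract's description concrete, the weighted score equation it refers to can be sketched in generic form as follows. The notation below (the weight function w, the score u, and the empirical distribution F_n) is introduced here for illustration and is not taken from the paper; it only indicates the type of estimating equation being described.

```latex
% A minimal sketch of a weighted likelihood score equation, assuming
% i.i.d. observations X_1, ..., X_n from a parametric density f(.; theta).
% The exact form and arguments of the weight w are illustrative assumptions,
% not the paper's definition.
\sum_{i=1}^{n} w(X_i; \theta, F_n)\, u(X_i; \theta) = 0,
\qquad
u(x; \theta) = \frac{\partial}{\partial \theta} \log f(x; \theta),
\qquad
0 \le w(\cdot) \le 1 .
```

In this reading, w(X_i; theta, F_n) gauges how compatible observation X_i is with the fitted model relative to the rest of the sample: observations consistent with the model keep weights near 1, so at a correctly specified model the equation essentially reduces to the ordinary likelihood score equation and full asymptotic efficiency is retained, while downweighting is applied only to observations flagged as incompatible.

As a further illustration of "downweighting only when necessary", the Python sketch below estimates a normal mean by iteratively solving a weighted score equation with a Huber-type weight. Both the model and the weight function are hypothetical stand-ins chosen to show the mechanics; they are not the weight function proposed in the paper.

```python
# Illustrative sketch only: weighted-likelihood estimation of a normal mean
# with a Huber-type weight (a stand-in, not the paper's weight function).
# Observations that fit the model keep weight 1; only extreme ones are
# downweighted.
import numpy as np

def weighted_normal_mean(x, sigma=1.0, c=3.0, n_iter=50):
    """Solve sum_i w_i * (x_i - mu) / sigma**2 = 0 by iterative reweighting."""
    mu = np.median(x)                          # robust starting value
    for _ in range(n_iter):
        r = (x - mu) / sigma                   # standardized residuals
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))  # weights in (0, 1]
        mu = np.sum(w * x) / np.sum(w)         # root of the weighted score equation
    return mu

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=95)
outliers = rng.normal(10.0, 1.0, size=5)       # 5% contamination
x = np.concatenate([clean, outliers])
print("weighted estimate:", weighted_normal_mean(x))
print("ordinary MLE (sample mean):", x.mean())
```

On the contaminated sample above, the weighted estimate stays much closer to 0 than the sample mean, which is pulled toward the outliers; on a clean sample the weights are essentially all 1 and the two estimates coincide, mirroring the full-efficiency behaviour described in the abstract.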
