Likelihood Inference for Possibly Non-Stationary Processes via Adaptive Overdifferencing (2011.04168v4)

Published 9 Nov 2020 in stat.ME

Abstract: We make an observation that facilitates exact likelihood-based inference for the parameters of the popular ARFIMA model without requiring stationarity by allowing the upper bound $\bar{d}$ for the memory parameter $d$ to exceed $0.5$: estimating the parameters of a single non-stationary ARFIMA model is equivalent to estimating the parameters of a sequence of stationary ARFIMA models. This allows for the use of existing methods for evaluating the likelihood for an invertible and stationary ARFIMA model. This enables improved inference because many standard methods perform poorly when estimates are close to the boundary of the parameter space. It also allows us to leverage the wealth of likelihood approximations that have been introduced for estimating the parameters of a stationary process. We explore how estimation of the memory parameter $d$ depends on the upper bound $\bar{d}$ and introduce adaptive procedures for choosing $\bar{d}$. We show via simulation how our adaptive procedures estimate the memory parameter well, relative to existing alternatives, when the true value is as large as 2.5.
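The overdifferencing idea described in the abstract can be sketched in a few lines. The snippet below is a rough illustration, not the paper's exact-likelihood procedure: it uses a profiled Whittle approximation for pure fractionally integrated noise, ARFIMA(0, d, 0), as a stand-in for the stationary likelihood, differences the data $k = 0, \dots, \lceil \bar{d} - 0.5 \rceil$ times, fits the stationary model to each differenced series, and reports the best fit on the original scale. The function names and the heuristic comparison across difference orders are illustrative assumptions; the paper works with the exact likelihood (which makes that comparison principled) and with adaptive rules for choosing $\bar{d}$.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_negloglik(d, z):
    """Profiled Whittle negative log-likelihood (up to constants) for
    ARFIMA(0, d, 0) fitted to the series z, with d in (-1/2, 1/2)."""
    n = len(z)
    j = np.arange(1, (n - 1) // 2 + 1)                 # positive Fourier frequencies
    lam = 2.0 * np.pi * j / n
    pgram = np.abs(np.fft.fft(z - z.mean())[j]) ** 2 / (2.0 * np.pi * n)
    g = np.abs(2.0 * np.sin(lam / 2.0)) ** (-2.0 * d)  # spectral shape of FI(d) noise
    sigma2 = np.mean(pgram / g)                        # innovation variance, profiled out
    return np.log(sigma2) + np.mean(np.log(g))

def estimate_d(x, d_bar):
    """Overdifferencing sketch: cover (-1/2, d_bar] by differencing x
    k = 0, 1, ..., ceil(d_bar - 1/2) times, fitting a stationary FI(d') model
    with d' in (-1/2, 1/2) to each differenced series, and reporting the best
    fit as d_hat = k + d'_hat. The comparison across k is a heuristic here;
    the paper's exact-likelihood equivalence makes it principled."""
    k_max = max(0, int(np.ceil(d_bar - 0.5)))
    best_obj, best_d = np.inf, None
    for k in range(k_max + 1):
        z = np.diff(x, n=k)                            # overdifference k times
        res = minimize_scalar(whittle_negloglik, args=(z,),
                              bounds=(-0.49, 0.49), method="bounded")
        if res.fun < best_obj:
            best_obj, best_d = res.fun, k + res.x
    return best_d
```

For example, with data generated from a process with memory parameter near 1.2 and an upper bound $\bar{d} = 2.5$, the loop fits stationary models to $x$, $\Delta x$, and $\Delta^2 x$, and would typically select $k = 1$ with a differenced estimate near 0.2, so that the reported estimate is near 1.2 on the original scale.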
