
Bayesian influence diagnostics using normalizing functional Bregman divergence (1904.03717v1)

Published 7 Apr 2019 in stat.AP

Abstract: Ideally, any statistical inference should be robust to local influences. Although there are simple ways to check for leverage points in independent, linear problems, more complex models require more sophisticated methods. Kullback-Leibler and Bregman divergences have already been applied in Bayesian inference to measure the isolated impact of each observation on a model. We extend these ideas to models for dependent data and non-normal probability distributions, such as time series, spatial models, and generalized linear models. We also propose a strategy to rescale the functional Bregman divergence to lie in the (0,1) interval, facilitating interpretation and comparison. This is accomplished with minimal computational effort while maintaining all theoretical properties. For computational efficiency, we take advantage of Hamiltonian Monte Carlo methods to draw samples from the posterior distribution of the model parameters. The resulting Markov chains are then directly connected with the Bregman calculus, which results in fast computation. We verify the proposals in both simulated and empirical studies.
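The abstract's case-deletion influence idea can be sketched with a standard estimator that predates this paper: the KL divergence between the full posterior and the posterior with observation i removed reduces to an expectation over the full-posterior draws (the Peng and Dey identity, KL_i = log E[1/f(y_i|θ)] + E[log f(y_i|θ)]). The paper's own (0,1) rescaling of the functional Bregman divergence is not reproduced here; as a generic stand-in, the sketch applies a McCulloch-style Bernoulli calibration, which maps KL into [0.5, 1). The toy normal model, planted outlier, and all variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): y_i ~ N(mu, 1) with a flat
# prior on mu, so the posterior is N(ybar, 1/n). We mimic MCMC output
# by drawing posterior samples directly.
y = rng.normal(0.0, 1.0, size=30)
y[0] = 6.0                                   # plant an influential outlier
n = len(y)
theta = rng.normal(y.mean(), np.sqrt(1.0 / n), size=5000)  # "posterior draws"

def loglik_i(theta, yi):
    """Pointwise log-likelihood log f(y_i | theta) for the normal model."""
    return -0.5 * np.log(2.0 * np.pi) - 0.5 * (yi - theta) ** 2

def kl_case_deletion(theta, yi):
    """Monte Carlo estimate of KL(full posterior || posterior without y_i):
    KL_i = log E[1/f(y_i|theta)] + E[log f(y_i|theta)]."""
    ll = loglik_i(theta, yi)
    m = (-ll).max()                            # stable log-mean-exp of -ll
    log_inv_cpo = m + np.log(np.mean(np.exp(-ll - m)))
    return log_inv_cpo + ll.mean()             # >= 0 by Jensen's inequality

kl = np.array([kl_case_deletion(theta, yi) for yi in y])

# Generic (0,1)-style rescaling via McCulloch's calibration: solve
# KL = KL(Bernoulli(0.5) || Bernoulli(p)), giving p in [0.5, 1).
p = 0.5 * (1.0 + np.sqrt(1.0 - np.exp(-2.0 * kl)))

print("most influential observation:", int(np.argmax(p)))
```

Larger p flags observations whose removal would shift the posterior the most; in this toy data the planted outlier dominates.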
