
Inference of multivariate exponential Hawkes processes with inhibition and application to neuronal activity (2205.04107v4)

Published 9 May 2022 in stat.ME

Abstract: The multivariate Hawkes process is a past-dependent point process used to model the relationships between event occurrences in different phenomena. Although the Hawkes process was originally introduced to describe excitation effects, meaning that one event increases the chances of another occurring, there has been growing interest in modelling the opposite effect, known as inhibition. In this paper, we focus on how to infer the parameters of a multidimensional exponential Hawkes process with both excitation and inhibition effects. Our first result is to prove the identifiability of this model under a few sufficient assumptions. We then propose a maximum likelihood approach to estimate the interaction functions, which is, to the best of our knowledge, the first exact inference procedure in the frequentist framework. Our method includes a variable selection step in order to recover the support of the interactions and therefore to infer the connectivity graph. A benefit of our method is that it provides an explicit computation of the log-likelihood, which additionally enables a goodness-of-fit test for assessing the quality of the estimates. We compare our method to standard approaches, which were developed in the linear framework and are not specifically designed to handle inhibiting effects. We show that the proposed estimator performs better on synthetic data than the alternative approaches. We also illustrate the application of our procedure to a neuronal activity dataset, which highlights the presence of both exciting and inhibiting effects between neurons.
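
To make the model concrete, the sketch below simulates a small multivariate exponential Hawkes process in which negative interaction weights encode inhibition, using Ogata's thinning algorithm. It assumes a rectified-linear intensity and a single shared decay rate, which are modelling assumptions of this illustration rather than details taken from the abstract; the function `simulate_hawkes` and its parameter names are hypothetical, and this is not the paper's inference procedure.

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    """Simulate a d-dimensional exponential Hawkes process on [0, T] by
    Ogata's thinning, allowing inhibition via negative entries of alpha.

    Intensity of dimension i (rectified-linear link, shared decay beta):
        lambda_i(t) = max(0, mu_i + sum_j alpha[i, j] * sum_{t_k^j < t} exp(-beta * (t - t_k^j)))
    """
    rng = np.random.default_rng(seed)
    d = len(mu)
    events = [[] for _ in range(d)]
    # S[i, j] tracks alpha[i, j] * sum_k exp(-beta * (t - t_k^j)) at the current time t.
    S = np.zeros((d, d))
    t = 0.0
    while True:
        # Dominating rate: only the positive part of the state contributes an upper
        # bound, and it can only decay until the next event, so the bound stays valid.
        lam_bar = float(np.sum(mu) + np.clip(S, 0.0, None).sum())
        if lam_bar <= 0.0:
            break
        w = rng.exponential(1.0 / lam_bar)        # candidate waiting time
        if t + w > T:
            break
        t += w
        S *= np.exp(-beta * w)                    # decay the state to time t
        lam = np.maximum(0.0, mu + S.sum(axis=1)) # true (rectified) intensities at t
        lam_sum = lam.sum()
        if lam_sum > 0 and rng.uniform() * lam_bar < lam_sum:
            i = rng.choice(d, p=lam / lam_sum)    # which dimension fires
            events[i].append(t)
            S[:, i] += alpha[:, i]                # add the new event's kernel kick
        # Otherwise the candidate is thinned out and time simply advances.
    return events

# Example: two "neurons" where 0 excites 1 (alpha[1, 0] = 0.6)
# and 1 inhibits 0 (alpha[0, 1] = -0.8).
mu = np.array([0.5, 0.5])
alpha = np.array([[0.0, -0.8],
                  [0.6,  0.0]])
events = simulate_hawkes(mu, alpha, beta=1.0, T=200.0)
print("events per dimension:", [len(e) for e in events])
```

The paper's contribution, maximum likelihood estimation with variable selection and a goodness-of-fit test, would operate on event data such as the output above; the simulator here only serves to illustrate what exciting and inhibiting interactions mean for the conditional intensities.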
