
Graph Permutation Entropy: Extensions to the Continuous Case, A step towards Ordinal Deep Learning, and More (2407.07524v2)

Published 10 Jul 2024 in nlin.CD

Abstract: Nonlinear dynamics play an important role in the analysis of signals. A popular, readily interpretable nonlinear measure is Permutation Entropy. It has recently been extended for the analysis of graph signals, thus providing a framework for nonlinear analysis of data sampled on irregular domains. Here, we introduce a continuous version of Permutation Entropy, extend it to the graph domain, and develop an ordinal activation function akin to those of neural networks. This is a step towards Ordinal Deep Learning, a potentially effective and very recently posited concept. We also formally extend ordinal contrasts to the graph domain. Continuous versions of ordinal contrasts of length 3 are also introduced and their advantage is shown in experiments. We also integrate specific contrasts for the analysis of images and show that this approach generalizes well to the graph domain, allowing images, represented as graph signals, to be mapped to a plane similar to the entropy-complexity one. Applications to synthetic data, including fractal patterns and popular nonlinear maps, and real-life MRI data show the validity of these novel extensions and potential benefits over the state of the art. By extending very recent concepts related to permutation entropy to the graph domain, we expect to accelerate the development of more graph-based entropy methods, enabling nonlinear analysis of broader kinds of data and establishing relationships with emerging ideas in data science.
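For context on the classical measure the paper extends, a minimal sketch of standard (discrete) Permutation Entropy in the Bandt-Pompe sense is shown below. This is not the paper's continuous or graph-domain variant; the function name, parameters, and normalization choice (dividing by log m! so values lie in [0, 1]) are illustrative assumptions.

```python
import math
from collections import Counter

def permutation_entropy(signal, m=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (classical, discrete form).

    Each length-m window is mapped to its ordinal pattern (the ranking of its
    values); the Shannon entropy of the pattern distribution is normalized by
    log(m!) so the result lies in [0, 1].
    """
    patterns = Counter()
    n = len(signal) - (m - 1) * delay
    for i in range(n):
        window = signal[i:i + m * delay:delay]
        # Ordinal pattern: the permutation of indices that sorts the window.
        pattern = tuple(sorted(range(m), key=window.__getitem__))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(m))
```

A strictly monotone series produces a single ordinal pattern and hence entropy 0, while an irregular series spreads mass over many patterns and pushes the value towards 1; the paper's contribution replaces the hard ranking step with continuous and graph-domain counterparts.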

