Empirical Laws of Interacting Systems

Updated 19 August 2025
  • Empirical laws of interacting systems are universal regularities emerging from strong correlations and complex interactions.
  • The classification using scaling exponents (c, d) distinguishes between exponential, power-law, and stretched exponential regimes in nature.
  • Generalized entropy forms, including Tsallis and two-parameter models, offer practical insights for modeling phenomena in turbulence, finance, and network dynamics.

Empirical laws of interacting systems are regularities, frequently expressed as mathematical asymptotics, scaling behavior, or distribution-function patterns, that emerge universally from collective dynamics involving strong, correlated, or otherwise nontrivial interactions. These laws go beyond mean-field approximations and the frameworks developed for non-interacting or weakly interacting systems, and they play a crucial role in interpreting the macroscopic behaviors and distributions observed in nature and in complex artificial systems. Their rigorous classification and derivation, and their connection to generalized entropy, scaling exponents, and universality classes, provide a foundational basis for understanding a wide range of empirical distributions, especially when the standard separability and additivity assumptions break down.

1. Generalized Entropy Forms and the Breakdown of Additivity

A central insight is that in strongly interacting systems, the usual assumptions underlying the Boltzmann–Gibbs (BG) entropy—specifically, separability and additivity—are not valid. The fourth Khinchin axiom (separability) is systematically violated by non-independent or correlated states, necessitating a generalization of the entropy function. The admissible class for such generalized entropic forms is

S_g[p] = \sum_{i=1}^W g(p_i)

where g is continuous, concave, and satisfies g(0) = 0. Examples include:

  • Boltzmann–Gibbs entropy: g_{BG}(x) = -x \ln x, valid under additivity.
  • Tsallis entropy: g_q(x) \propto x - x^q for 0 < q < 1, characteristic of power-law statistics and nonadditive, but still yielding extensivity under certain correlations.

Violation of separability (Khinchin 4) allows for a richer set of entropic forms to be consistent with thermodynamic extensivity in the presence of interactions (Hanel et al., 2010).
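
As a concrete illustration of the trace form S_g[p] = \sum_i g(p_i), the following minimal Python sketch evaluates the two g functions listed above for an arbitrary probability vector; the probability values and the 1/(q - 1) Tsallis normalization used here are assumptions of this sketch rather than prescriptions from the source.

```python
import numpy as np

def entropy_trace_form(p, g):
    """Generic trace-form entropy S_g[p] = sum_i g(p_i)."""
    p = np.asarray(p, dtype=float)
    return float(np.sum(g(p[p > 0])))   # states with p_i = 0 contribute g(0) = 0

def g_bg(x):
    """Boltzmann-Gibbs: g(x) = -x ln x."""
    return -x * np.log(x)

def g_tsallis(x, q=0.7):
    """Tsallis: g(x) proportional to x - x**q; the 1/(q - 1) prefactor used here
    is one common normalization and an assumption of this sketch."""
    return (x - x ** q) / (q - 1.0)

p = [0.5, 0.25, 0.125, 0.125]                                # illustrative distribution
print(entropy_trace_form(p, g_bg))                           # BG (Shannon) entropy in nats
print(entropy_trace_form(p, lambda x: g_tsallis(x, q=0.7)))  # Tsallis entropy, q = 0.7
```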

2. Asymptotic Scaling and Classification: Scaling Exponents (c, d)

The macroscopic scaling of entropy in the thermodynamic limit (W \to \infty) is classified by two scaling exponents, (c, d), defined as follows:

  • The first exponent c is determined via

f(z) = \lim_{x \to 0^+} \frac{g(zx)}{g(x)} = z^c

with 0 < c \leq 1.

The value c = 1 recovers the BG case, while c < 1 signals a distinct, nonadditive asymptotic regime relevant for interacting systems. For example, for g_q(x) \propto x - x^q with 0 < q < 1, the x^q term dominates as x \to 0^+, so f(z) = z^q and hence c = q.

  • The second exponent d arises from how the entropy scales when the number of accessible states is raised to a power, W \to W^{1+a}:

h_c(a) = \lim_{W \to \infty} \frac{S(W^{1+a})}{S(W)}\, W^{a(c-1)} = (1 + a)^d

The pair (c, d) uniquely determines the universality class of the system’s statistical behavior in the infinite-system limit. Equivalence classes of entropies and, correspondingly, of distribution functions are thus labeled by these exponents.

Entropy Type | c | d | Example Distribution Function
Boltzmann–Gibbs | 1 | 1 | Exponential
Tsallis (q-power law) | q | 0 | Power-law
Stretched exponential | 1 | d > 0 | Stretched exponential

Known empirical distribution functions (exponential, power-law, stretched exponential) arise as asymptotic forms compatible with a given (c, d) class; these cover virtually all widely observed tail behaviors in empirical data (Hanel et al., 2010).
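
As a minimal numerical illustration of the two defining limits, the sketch below estimates c from f(z) and d from h_c(a) for the BG and Tsallis trace forms, using the entropy of W equiprobable states in each case; the specific parameter choices (q = 0.6, the small-x cutoff, and the value of W) are assumptions made only for this sketch.

```python
import numpy as np

def estimate_c(g, z=0.5, x=1e-100):
    """Estimate c from f(z) = lim_{x -> 0+} g(z*x) / g(x) = z**c,
    evaluated at a small but finite x."""
    return np.log(g(z * x) / g(x)) / np.log(z)

def estimate_d(S_of_W, c, a=1.0, W=10.0**7):
    """Estimate d from h_c(a) = [S(W**(1+a)) / S(W)] * W**(a*(c-1)) = (1+a)**d,
    evaluated at a large but finite W."""
    h = S_of_W(W ** (1 + a)) / S_of_W(W) * W ** (a * (c - 1.0))
    return np.log(h) / np.log(1.0 + a)

q = 0.6                                              # illustrative Tsallis index
g_bg = lambda x: -x * np.log(x)                      # BG trace form
g_q  = lambda x: (x - x ** q) / (q - 1.0)            # Tsallis trace form

# Entropy of W equiprobable states in each case.
S_bg = lambda W: np.log(W)                           # S_BG(W) = ln W
S_q  = lambda W: (W ** (1 - q) - 1.0) / (1.0 - q)    # Tsallis entropy at equipartition

print(estimate_c(g_bg), estimate_d(S_bg, c=1.0))     # ~ (1, 1): exponential class
print(estimate_c(g_q),  estimate_d(S_q,  c=q))       # ~ (0.6, 0): power-law class
```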

3. Representative Entropy: The Two-Parameter Family S_{c,d}

A unique entropy functional, covering all equivalence classes defined by (c, d), is given by:

S_{c,d}[p] \propto \sum_{i=1}^W \Gamma(1 + d,\, 1 - c \ln p_i)

where \Gamma(a, b) = \int_b^\infty t^{a-1} e^{-t}\, dt is the upper incomplete Gamma function. In explicit form,

S_{c,d}[p] = \frac{e}{1 - c + cd} \sum_{i=1}^W \Gamma(1 + d,\, 1 - c \ln p_i) - \frac{c}{1 - c + cd}
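
A direct numerical evaluation of this explicit form can be sketched with SciPy; scipy.special.gammaincc is the regularized upper incomplete Gamma function, so it is multiplied by \Gamma(1 + d) to recover \Gamma(1 + d, \cdot). The probability vector and parameter choices below are illustrative, and the constants follow the formula exactly as displayed above.

```python
import numpy as np
from scipy.special import gamma, gammaincc

def upper_incomplete_gamma(a, x):
    """Gamma(a, x) = integral_x^inf t**(a-1) * exp(-t) dt.
    gammaincc(a, x) is the regularized version Gamma(a, x) / Gamma(a)."""
    return gammaincc(a, x) * gamma(a)

def S_cd(p, c, d):
    """Two-parameter entropy S_{c,d}[p], evaluated from the explicit formula above."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                   # zero-probability states contribute nothing
    norm = 1.0 - c + c * d
    total = np.sum(upper_incomplete_gamma(1.0 + d, 1.0 - c * np.log(p)))
    return float(np.e / norm * total - c / norm)

p = [0.4, 0.3, 0.2, 0.1]                           # illustrative distribution
print(S_cd(p, c=1.0, d=1.0))                       # BG class (c, d) = (1, 1)
print(S_cd(p, c=0.7, d=0.0))                       # Tsallis-like class (c, d) = (q, 0), q = 0.7
```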

This entropy has the following properties:

  • Reduces to BG entropy for (c, d) = (1, 1).
  • Recovers Tsallis entropy for (c, d) = (q, 0).
  • Stretched exponentials and other empirically abundant distributions are encompassed by c = 1, d > 0.
  • The associated probability distributions derived from a maximum entropy principle take the form of generalized exponentials involving the Lambert W function.

Therefore, all physically and empirically relevant cases—covering exponential, power-law, and stretched exponential tails—are captured within this two-parameter entropy family (Hanel et al., 2010).

4. Universality and the Origin of Empirical Distribution Laws

The systematic (c, d) classification implies that interacting systems in equilibrium with an extensive entropy can only realize a narrow set of asymptotic probability distributions. Regardless of microscopic model complexity, the following statements hold:

  • Empirical tail laws: Only exponential (c = 1, d = 1), power-law (c < 1, d = 0), and stretched exponential (c = 1, d > 0) behaviors are possible as W \to \infty if extensivity is to be preserved; these tail families are written out schematically after this list.
  • Universality classes: Systems are grouped into universality classes according to their scaling exponents (c, d). Measurements of entropy scaling allow identification of the appropriate entropy/density function relevant for modeling.
  • Distribution functions: The maximum entropy distributions for each class are given by generalized exponentials, unified under the Lambert W framework. These include all widely relevant tail behaviors found in nature.
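
Schematically, with x_0, \alpha, and \beta denoting positive constants introduced here purely for illustration, the three admissible tail families read:

p(x) \sim e^{-x/x_0} \qquad (c = 1,\ d = 1) \quad \text{(exponential)}

p(x) \sim x^{-\alpha} \qquad (c < 1,\ d = 0) \quad \text{(power law)}

p(x) \sim e^{-(x/x_0)^{\beta}} \qquad (c = 1,\ d > 0) \quad \text{(stretched exponential)}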

This theoretical structure explains the empirical prevalence of exponential, power-law, and stretched exponential distributions across disciplines, ranging from turbulence and finance to biological and network systems (Hanel et al., 2010).

5. Comparison with Rényi-Type Entropies

Rényi entropy and related non-additive forms can be written as

S = G\left(\sum_{i=1}^W g(p_i)\right)

with, for example, G(x) = \ln(x)/(1 - \alpha) and g(x) = x^\alpha.
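
As a quick check of the scaling-equivalence point below, the following sketch evaluates the Rényi entropy in exactly this G(\sum_i g(p_i)) form; for the uniform distribution over W states it equals \ln W for every \alpha \neq 1, so its growth with W matches the BG case.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy in the G(sum_i g(p_i)) form,
    with g(x) = x**alpha and G(x) = ln(x) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

# For the uniform distribution over W states, sum_i p_i**alpha = W**(1 - alpha),
# so S = ln W for every alpha != 1: the growth with W matches the BG case,
# consistent with the (c, d) = (1, 1) scaling equivalence noted below.
for W in (10, 1_000, 100_000):
    p = np.full(W, 1.0 / W)
    print(W, renyi_entropy(p, alpha=0.5), np.log(W))
```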

  • Scaling equivalence: The asymptotic scaling reduces to the BG class (c, d) = (1, 1), owing to the monotonicity and asymptotic properties of G.
  • Lesche stability: Rényi entropy is not Lesche stable, which limits its utility for robust characterization. In contrast, entropies of the trace form S_g = \sum_i g(p_i) are proven to be Lesche stable within this framework.

While Rényi-type entropies can be analyzed similarly, they are not generically representative of systems with genuine interaction-induced extensivity conditions (Hanel et al., 2010).

6. Measurement and Applications in Complex Interacting Systems

Practical implications of these results include:

  • Entropy scaling experiments: By measuring how the entropy scales with system size, empirical values of (c, d) can be determined and used to infer the underlying universality class and the correct entropy functional for the system (a simple fitting sketch follows this list).
  • Model selection: In systems where conventional BG statistics fails (persistent long-range correlations, anomalous diffusion, etc.), the maximum entropy principle under the generalized entropy S_{c,d} provides the appropriate equilibrium distribution.
  • Application domains: The classification and analytical results are directly applicable to empirical phenomena in turbulence, granular flows, network dynamics, earthquake statistics, finance, and biological systems where observed distributions strongly deviate from the exponential form.
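
One way such an entropy-scaling measurement could be post-processed is sketched below. The synthetic S(W) curves, the leading-order ansatz S(W) \approx A\, W^{1-c} (\ln W)^d (chosen because it reproduces the three classes in the table above), and the simple least-squares fit are all assumptions of this sketch, not a procedure prescribed by the source.

```python
import numpy as np

def fit_cd(W, S):
    """Fit the leading-order ansatz S(W) ~ A * W**(1-c) * (ln W)**d by linear
    least squares in log space and return the estimated (c, d)."""
    X = np.column_stack([np.ones_like(W), np.log(W), np.log(np.log(W))])
    logA, one_minus_c, d = np.linalg.lstsq(X, np.log(S), rcond=None)[0]
    return 1.0 - one_minus_c, d

W = np.logspace(3, 9, 15)                    # hypothetical system sizes

# Synthetic "measured" entropy curves, one per universality class (illustrative only).
S_powerlaw  = 2.0 * W ** (1.0 - 0.6)         # (c, d) = (0.6, 0): power-law class
S_stretched = np.log(W) ** 2.0               # (c, d) = (1, 2):   stretched-exponential class

print(fit_cd(W, S_powerlaw))                 # ~ (0.6, 0.0)
print(fit_cd(W, S_stretched))                # ~ (1.0, 2.0)
```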

In summary, for any strongly interacting statistical system, the empirical laws (asymptotic distributions) compatible with extensive entropy and the first three Khinchin axioms must fall into a restricted set of universality classes parameterized by (c, d). The associated entropy functionals, distribution tails, and empirical regularities are thereby derived ab initio, providing a theoretical foundation for the ubiquity and forms of observed empirical laws in interacting systems (Hanel et al., 2010).
