Universality classes of chaos in non-Markovian dynamics (2512.22445v1)
Abstract: Classical chaos theory rests on the notion of universality, whereby disparate dynamical systems share identical scaling laws. Existing universality classes, however, implicitly assume Markovian dynamics. Here, a logistic map endowed with power-law memory is used to show that Feigenbaum universality breaks down when temporal correlations decay sufficiently slowly. A critical memory exponent is identified that separates perturbative and memory-dominated regimes, demonstrating that long-range memory acts as a relevant renormalisation operator and generates a new universality class of chaotic dynamics. The onset of chaos is accompanied by fractional scaling of Lyapunov exponents, in quantitative agreement with analytical predictions. These results establish temporal correlations as a previously unexplored axis of universality in chaotic systems, with implications for physical, biological and geophysical settings where memory effects are intrinsic.
Explain it Like I'm 14
Universality classes of chaos in non-Markovian dynamics — explained simply
Overview: What is this paper about?
This paper asks a big question about chaos: do the famous “universal rules” of how chaos begins still hold when a system remembers its past? In many real things (like living cells, the climate, or gooey materials), what happens next depends not just on now, but also on what happened a while ago. The paper shows that when this “memory” lasts a long time, the usual universal rules of chaos break down and new kinds of universal behavior appear.
What questions does the paper ask?
The authors focus on a few simple, clear questions:
- How does adding memory to a simple chaotic system change when and how chaos starts?
- Is there a sharp line between “short memory” (where the usual rules still work) and “long memory” (where new rules take over)?
- Do the classic signatures of chaos (like Feigenbaum’s pattern of period-doubling and the square‑root scaling of the Lyapunov exponent) survive with memory, or are there new patterns and new exponents?
How did they study it?
They used a well-known simple model of chaos and gave it memory.
The model (think: a recipe with a twist)
- Start with the logistic map, a simple rule that updates a number between 0 and 1:
- Without memory: next = r × now × (1 − now). This tiny rule can create very complex behavior.
- Add memory: the new “next” also includes a weighted average of many past values. The weights shrink like 1/k^α (read “one over k to the alpha”), where k is how many steps back you look.
- If α > 1 (short memory), the past fades quickly.
- If α ≤ 1 (long memory), the past fades very slowly—so even old events keep mattering.
- ε controls how strong the memory is.
A helpful picture: imagine shouting in a canyon. If the echo fades fast (α > 1), only your latest shout matters. If the echo fades slowly (α ≤ 1), old shouts keep bouncing back and change what you hear now.
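This recipe can be sketched in a few lines of code. The kernel placement, the finite history cutoff, and the clipping to [0, 1] below are illustrative assumptions; the paper's exact Eq. (1) may differ in these details:

```python
import numpy as np

def memory_logistic(r, eps, alpha, n_steps=500, x0=0.5, cutoff=200):
    """Iterate x_{n+1} = r*x_n*(1 - x_n) + eps * sum_k x_{n+1-k} / k**alpha.

    A sketch only: the kernel form, truncation, and clipping are
    assumptions, not necessarily the paper's exact Eq. (1).
    """
    x = np.empty(n_steps)
    x[0] = x0
    weights = 1.0 / np.arange(1, cutoff + 1) ** alpha   # 1/k^alpha memory kernel
    for n in range(n_steps - 1):
        k = min(n + 1, cutoff)              # available history length
        past = x[n::-1][:k]                 # x_n, x_{n-1}, ... (k = steps back)
        mem = weights[:k] @ past            # power-law weighted history
        xn1 = r * x[n] * (1 - x[n]) + eps * mem
        x[n + 1] = min(max(xn1, 0.0), 1.0)  # clip to [0, 1] (one possible choice)
    return x
```

With eps = 0 this reduces to the plain logistic map, which is a convenient sanity check.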
What they measured (in everyday terms)
- Stability: If you slightly nudge the system, does it calm down or spiral away? They analyzed this using math that tracks how tiny differences grow.
- Lyapunov exponent: This number tells you how fast two almost-identical starting points drift apart—like drawing two pencil lines very close together and seeing how quickly they separate. Bigger means more chaotic.
- Bifurcation patterns: They looked for the classic “period-doubling cascade” (the Feigenbaum route to chaos), where stable behavior splits into 2, then 4, then 8, and so on, before turning chaotic.
They combined:
- Analytical work (equations that predict stability and scaling).
- Computer experiments (plots of behavior and Lyapunov exponents) to confirm the theory.
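As a concrete baseline, the finite-time Lyapunov exponent of the memoryless logistic map can be computed in a few lines; the memory-endowed version would additionally have to propagate history terms through the tangent dynamics, which the paper flags as nontrivial:

```python
import numpy as np

def lyapunov_logistic(r, n_steps=20000, n_transient=1000, x0=0.3):
    """Finite-time Lyapunov exponent of the plain logistic map:
    lambda_N = (1/N) * sum_n ln|f'(x_n)|, with f'(x) = r*(1 - 2x).

    Baseline sketch for the Markovian case; the non-Markovian map needs
    an augmented tangent-space treatment.
    """
    x = x0
    for _ in range(n_transient):            # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_steps):
        acc += np.log(abs(r * (1 - 2 * x)) + 1e-300)  # guard against log(0)
        x = r * x * (1 - x)
    return acc / n_steps
```

At r = 4 this converges toward the known value ln 2; for r in a stable window it is negative.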
What did they find, and why does it matter?
Here are the main takeaways:
- There is a sharp boundary at α = 1:
- For α > 1 (short memory), the usual universal behavior survives. The classic period‑doubling route to chaos still happens. The main effect of memory is to shift where (in r) the transitions occur. The Lyapunov exponent near the onset still grows like a square root: λ ∝ (r − r_c)^{1/2}.
- For α = 1 (borderline case), even tiny memory destabilizes steady behavior in a delicate, logarithmic way. This is the tipping point between the two worlds.
- For α < 1 (long memory), the rules change. The system no longer follows the classic Feigenbaum pattern. Chaos can appear earlier and in different ways (even coming and going as you change r). Most importantly, the Lyapunov exponent grows with a new, “fractional” power:
- λ ∝ (r − r_c)^{β(α)} with β(α) = 1 / (2 − α)
- This is different from the usual 1/2 and depends smoothly on how long the memory lasts.
Why this is important:
- It shows that “memory” is a new axis for classifying chaotic systems—just like how different spatial dimensions change the rules in physics. In short: how long a system remembers its past can define a new “universality class” of chaos.
- It explains why many real systems with history (biology, climate, viscoelastic flows, etc.) may not follow the textbook route to chaos.
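The new exponent is easy to evaluate. This tiny sketch tabulates β(α) = 1/(2 − α) for a few long-memory values, showing how it moves smoothly away from the classical 1/2:

```python
def beta(alpha):
    """Predicted fractional onset exponent beta(alpha) = 1/(2 - alpha),
    stated in the paper for the long-memory regime (alpha < 1)."""
    if not alpha < 1:
        raise ValueError("formula stated for the long-memory regime alpha < 1")
    return 1.0 / (2.0 - alpha)

for a in (0.25, 0.5, 0.75):
    print(f"alpha = {a:.2f} -> beta = {beta(a):.3f}")
```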
What could this change or help in the future?
- Better models: Many real systems have memory. This work gives a simple, clear framework for including memory and predicting when chaos starts or stops.
- New ways to diagnose memory: By measuring how the Lyapunov exponent rises near the onset of chaos, you can estimate how strong and long the system’s memory is—without directly observing the memory.
- Broader theory: It points to building a new kind of “renormalization” theory (a way to group systems by their deep similarities) that includes time memory, not just present state.
- Open questions: How does this extend to more complicated systems (many variables, continuous time)? What happens to other chaos indicators (like full Lyapunov spectra or entropy) under long memory?
In a sentence: The paper shows that when the past doesn’t fade quickly, chaos plays by new rules—and those rules form new universality classes governed by how strongly and how slowly the past fades.
Knowledge Gaps
Knowledge gaps, limitations, and open questions
Below is a single, consolidated list of what remains missing, uncertain, or unexplored in the paper, articulated as concrete, actionable items for future research.
- Rigorous universality beyond the logistic map: The paper’s claims are demonstrated only for a logistic map with an additive power-law memory term. It remains unverified whether the identified universality classes and critical exponents hold across other smooth unimodal maps, piecewise-smooth maps, or maps with different nonlinearities (e.g., tent, quadratic-cubic families).
- Memory-kernel generality: Only a power-law kernel is analyzed. The universality class dependence on kernel shape (exponential/stretched-exponential kernels, finite-support kernels, oscillatory kernels, kernels with sign changes) and normalization remains unexplored.
- Sign and structure of memory feedback: The analysis assumes small, positive memory strength (ε > 0). Effects of negative ε (inhibitory memory), state-dependent ε, multiplicative memory, or memory applied to the nonlinear term rather than to the state are not treated.
- Finite-memory truncation effects: Numerical results rely on a finite memory cutoff (e.g., 2000 past steps). The bias introduced by truncation, convergence behavior with increasing cutoff, and error bounds between truncated and infinite-memory dynamics are not quantified.
- Domain invariance and boundedness: With the additive memory term, iterates may leave the unit interval [0, 1]. The conditions under which the state remains bounded (or returns to the invariant interval) and the impact of boundary-handling (clipping, reflecting) on universality are not analyzed.
- Full Lyapunov spectrum in infinite-dimensional state space: The paper uses a single (finite-time) Lyapunov exponent, but a non-Markovian map is effectively infinite-dimensional. A principled computation of the full Lyapunov spectrum (e.g., via tangent dynamics in an augmented state space) and its scaling laws with memory is missing.
- Universality constants in the summable-memory regime: While classical period-doubling behavior is said to “persist” for α > 1, the paper does not provide quantitative tests of the Feigenbaum constants (δ, α) as functions of the memory strength ε, nor error bars demonstrating their robustness under summable memory.
- Characterization of “memory-dominated” universality constants: In the long-memory regime (α < 1), there is no Feigenbaum cascade; yet no analogous constants or scaling descriptors (e.g., parameter-space accumulation rates, scaling of laminar phases, crisis thresholds) are defined or measured.
- Stability analysis beyond fixed points: The linear stability analysis focuses on fixed points. Stability and bifurcation structure for periodic orbits (period-2, 4, …) and their accumulation properties under memory are not derived, limiting the universality claims about route-to-chaos modifications.
- Error-controlled asymptotics of the polylogarithm: The fractional-scaling derivation leverages the singular expansion of the polylogarithm near z = 1. Rigorous remainder bounds, validity ranges, and branch-cut issues for the polylogarithm in the relevant parameter regimes are not provided.
- Non-perturbative behavior at larger memory strength: The paper treats the summable regime perturbatively (small ε). Whether strong memory can induce qualitatively new behavior (e.g., breakdown of the Feigenbaum cascade) is not investigated.
- Route-to-chaos diversity: Beyond period-doubling, other classical routes (intermittency, quasiperiodicity, crises) under power-law memory are mentioned qualitatively but not systematically classified or quantified (e.g., scaling of laminar lengths, universality of Pomeau–Manneville types under memory).
- Mechanism and generality of re-entrant stabilization: The “re-entrant” suppression of chaos at large driving under long memory is shown numerically but not mechanistically explained or tested for robustness across α, map families, and kernels.
- Continuous-time analogs: The extension to continuous-time systems (fractional differential equations, delay differential equations with distributed kernels) and whether the critical line at α = 1 persists is unaddressed.
- Functional renormalization-group (RG) framework: The claim that long-range memory acts as a “relevant renormalization operator” lacks a constructed RG transformation in functional phase space; fixed points, eigenoperators, and scaling dimensions for memory kernels are not developed.
- Effective temporal dimensionality: The analogy of the memory exponent α as a “temporal dimension” is heuristic. Formalizing this notion (e.g., via scaling relations, hyperscaling analogs, and a universality-class taxonomy tied to α) remains an open theoretical task.
- Entropy production and invariant measures: How memory alters metric invariants (Kolmogorov–Sinai entropy, Pesin identities), invariant measures, and attractor geometry (fractal dimensions) is not analyzed or measured.
- Finite-time scaling and fitting protocols: The extraction of β from numerics lacks detailed methodology (fitting windows, detrending, statistical uncertainties, finite-time corrections), making reproducibility and error assessment difficult.
- Initial-condition sensitivity and basins: The dependence of outcomes (e.g., onset of chaos, re-entrance) on initial conditions and the structure of basins of attraction under memory are not mapped.
- Robustness to noise and stochasticity: A systematic study of temporally correlated noise (colored noise, fractional Gaussian noise) and its interplay with deterministic memory in shaping universality classes is not provided.
- Parameter-identification from data: Practical procedures for inferring α and ε from empirical time series (using fractional Lyapunov scaling or other markers) and their identifiability limits are not developed.
- Calibration to real systems: While applications are suggested (neural, viscoelastic, ecological, etc.), no concrete mapping from physical parameters to (α, ε), nor validation with experimental/observational datasets, is presented.
- Generalized memory structures: The analysis assumes linear superposition of past states. Nonlinear memory (e.g., kernels acting on nonlinear functions of the state history), state-dependent kernels, or coupling between the memory and the control parameter r are not explored.
- Mathematical well-posedness: Conditions for existence, uniqueness, and boundedness of solutions under various α and ε (including global attractor existence) are not established for the infinite-history map.
- Computational methodology transparency: Numerical implementation details (time stepping, handling of boundary violations, acceleration of long-memory convolution, code availability) are insufficient for independent verification and scaling studies.
Glossary
- Analytic continuation: Extension of a function beyond its original domain using complex analysis. "with ζ(α) arising from the analytic continuation of the memory sum."
- Asymptotic expansion: An approximation of a function in a limit (e.g., small parameter) via a series capturing leading behavior. "has the singular asymptotic expansion near z = 1:"
- Bifurcation diagram: Plot showing qualitative changes in a system’s long-term behavior as a parameter varies. "Top row: bifurcation diagram;"
- Characteristic equation: The equation whose roots determine stability or growth rates of linearized dynamics. "This is the characteristic equation, the central analytical object of the paper."
- Correlation time: The timescale over which perturbations remain correlated; the inverse of the slowest growth rate. "leading to a divergence of the correlation time."
- Critical dimension: A parameter value at which qualitative changes in universality or stability occur. "α = 1 is identified as a critical dimension in time."
- Critical exponent: The exponent describing how a quantity (e.g., Lyapunov exponent) scales near a critical point. "(b) Critical exponent β vs. memory exponent α."
- Critical scaling: Power-law dependence of observables near a transition. "Thus, the Lyapunov exponent exhibits non-classical critical scaling,"
- Delay-coupled oscillators: Oscillatory systems whose interactions include explicit time delays. "from viscoelastic and glassy materials to biological feedback networks, delay-coupled oscillators and non-Markovian quantum environments"
- Feigenbaum constants: Universal numerical constants characterizing period-doubling routes to chaos. "with Feigenbaum constants δ and α"
- Feigenbaum universality: Universal scaling behavior in period-doubling cascades across many maps. "Feigenbaum universality breaks down when temporal correlations decay sufficiently slowly."
- Finite-time Lyapunov exponent: Time-averaged measure of local exponential divergence over a finite horizon. "The finite-time Lyapunov exponent measures the average logarithmic expansion rate:"
- Fractional scaling: Non-integer power-law scaling of observables with control parameters. "The onset of chaos is accompanied by fractional scaling of Lyapunov exponents,"
- Infinite-dimensional state: State description requiring infinitely many variables due to memory or nonlocality. "requiring an infinite-dimensional state for complete description."
- Intermittency: Dynamical behavior with alternating laminar (regular) phases and chaotic bursts. "such as the onset of chaos in the logistic map via intermittency – the Lyapunov exponent scales as"
- Lyapunov exponent: Measure of the average exponential rate of divergence of nearby trajectories. "Lyapunov exponent λ."
- Marginal stability: Boundary condition where growth rates are near zero and stability is undecided. "with growth rates near marginal stability,"
- Markovian framework: Assumption that future states depend only on the present state, not the past. "placing them within a Markovian framework."
- Mellin transform: Integral transform used to analyze series and asymptotics, connecting sums to power laws. "by matching the Mellin transform of the geometric series,"
- Memory exponent: Parameter controlling the decay rate of memory kernel weights (power-law exponent). "A critical memory exponent is identified that separates perturbative and memory-dominated regimes,"
- Non-Markovian dynamics: Dynamics where future evolution depends on an extended history. "Universality classes of chaos in non-Markovian dynamics"
- Nonlocal multiplier: A term in the linearized dynamics that depends on past states and acts multiplicatively. "with the memory term acting as a nonlocal multiplier,"
- Period-doubling cascade: Sequence of bifurcations where periodic orbits double their period, leading to chaos. "Classical universality (period-doubling cascade),"
- Phase-space structure: Geometric organization of trajectories and invariant sets in state space. "characterising the geometric accumulation of bifurcation points and the scaling and slope of the phase-space structure,"
- Pitchfork bifurcation: Symmetry-breaking bifurcation where a single fixed point splits into two. "near a pitchfork bifurcation the relaxation rate scales linearly"
- Polylogarithm: Special function appearing in memory-driven characteristic equations. "Defining the polylogarithm"
- Power-law memory: Memory kernel with weights decaying as a power of time lag. "We demonstrate, using a power-law memory kernel, that classical universality survives only when temporal correlations are summable."
- Renormalisation group fixed point: Scale-invariant point governing universal behavior across systems. "The universal scaling laws are governed by a finite-dimensional renormalisation group fixed point,"
- Relevant renormalisation operator: Perturbation that grows under renormalisation and changes universality. "long-range memory acts as a relevant renormalisation operator"
- Re-entrant stabilization: Return to stability at larger driving after chaos onset. "characterized by early onset, fractional critical scaling, and re-entrant stabilization."
- Riemann zeta function: Special function governing summability of memory. "where ζ(α) is the Riemann zeta function,"
- Saddle-node bifurcation: Bifurcation where two fixed points collide and annihilate; also called tangent bifurcation. "In contrast, near a tangent (saddle-node) bifurcation – such as the onset of chaos in the logistic map via intermittency –"
- Summable memory: Memory with convergent total weight (α > 1), yielding perturbative effects. "Summable memory preserves classical universality,"
- Temporal correlations: Statistical dependence across time that shapes dynamical behavior. "temporal correlations act as a previously unexplored axis of universality in chaotic systems,"
- Temporal nonlocality: Dependence of current dynamics on the entire past history. "introducing temporal nonlocality whereby future evolution correlates with the distant past,"
- Transcendental equation: Equation involving non-algebraic functions (e.g., logs) with solutions not expressible in finite radicals. "a transcendental equation for the growth rate in terms of the memory parameters."
- Universality class: Group of systems sharing the same scaling laws and critical behavior. "generates a new universality class of chaotic dynamics."
- Unimodal maps: One-dimensional maps with a single peak, central to Feigenbaum universality. "This universality extends to a broad class of smooth, unimodal maps"
- Viscoelastic: Materials exhibiting both viscous and elastic response, leading to long-lived memory effects. "from viscoelastic and glassy materials to biological feedback networks,"
Practical Applications
Immediate Applications
Below is a concise set of actionable uses that can be deployed now, derived from the paper’s findings on memory-driven universality in chaotic dynamics.
- Memory-aware parameter calibration for short-memory systems (α > 1)
- Sector: industrial control, robotics, epidemiology, education
- What: Use the paper’s result that summable memory renormalizes critical points without changing classical universality. Replace Markovian models with “effective” parameterizations (e.g., shift r → r_c(α) using ε ζ(α)) to match observed stability/instability boundaries.
- Tools/workflows: “Parameter renormalizer” module that estimates ε, α and adjusts control/fit parameters; quick bifurcation-scan scripts for calibration.
- Assumptions/dependencies: Memory kernel is summable; measured dynamics near stationary regimes; model structure close to logistic-like single-hump nonlinearity.
- Diagnostic for hidden temporal correlations via Lyapunov scaling
- Sector: healthcare (ICU physiology), finance (volatility), climate, neuroscience
- What: Estimate finite-time Lyapunov exponents versus control/driving and fit β; deviations from β = 1/2 indicate non-Markovian long memory (α ≤ 1). Use this to detect memory-dominated regimes that alter risk/instability forecasts.
- Tools/workflows: “Lyapunov-scaling estimator” (time-series library plugin) that performs parameter sweep, computes λN, and fits β to infer α via β = 1/(2 − α).
- Assumptions/dependencies: Sufficient data length, quasi-stationarity, manageable noise; ability to scan or proxy a control parameter; reliable phase-space reconstruction for empirical data.
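A minimal sketch of the final step of such an estimator, assuming the onset point r_c is already known and λ has been measured on a parameter sweep (the function names here are hypothetical, not from the paper):

```python
import numpy as np

def fit_beta(r_values, lyap_values, r_c):
    """Least-squares fit of lambda ~ (r - r_c)^beta in log-log coordinates.
    Assumes r_c is known; in practice it must be estimated jointly."""
    r_values = np.asarray(r_values, dtype=float)
    lyap_values = np.asarray(lyap_values, dtype=float)
    mask = (r_values > r_c) & (lyap_values > 0)       # keep the scaling window
    slope, _ = np.polyfit(np.log(r_values[mask] - r_c),
                          np.log(lyap_values[mask]), 1)
    return slope

def infer_alpha(beta_fit):
    """Invert the paper's prediction beta = 1/(2 - alpha); only meaningful
    when the fitted beta lies in the long-memory range (1/2, 1)."""
    return 2.0 - 1.0 / beta_fit
```

A fitted β consistent with 1/2 points to the classical (Markovian or short-memory) class; β between 1/2 and 1 maps back to a memory exponent α = 2 − 1/β.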
- Controller design checklist to avoid memory-induced destabilization
- Sector: robotics, process control, algorithmic trading
- What: Translate the α ≤ 1 destabilization result into practical rules: limit integral/averaging depth (effective ε) and ensure feedback memories are summable. Flag designs that approximate power-law tails (moving-average windows that effectively grow, cascaded integrators).
- Tools/workflows: Control design guideline sheets; linter that inspects controllers for long-memory patterns.
- Assumptions/dependencies: Approximate mapping from implementation (integral/filters) to effective α, ε; linearization validity near operating points.
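Such a linter might use a heuristic like the following to classify a feedback kernel's tail (a log-log slope fit; the function name and interpretation are illustrative, not from the paper):

```python
import numpy as np

def effective_memory_exponent(weights):
    """Estimate an effective power-law exponent alpha for a feedback
    kernel from the log-log slope of its tail. alpha > 1 suggests the
    memory is summable, i.e. the classical perturbative regime applies."""
    w = np.abs(np.asarray(weights, dtype=float))
    k = np.arange(1, len(w) + 1)
    tail = slice(len(w) // 2, None)          # fit on the tail only
    slope, _ = np.polyfit(np.log(k[tail]), np.log(w[tail]), 1)
    return -slope
```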
- R&D tuning in viscoelastic and microfluidic systems to suppress chaos
- Sector: materials, microfluidics, chemical engineering
- What: Exploit observed re-entrant stabilization and altered chaos onset by adjusting polymer relaxation times/concentration to effectively tune memory strength, suppressing fluctuations at high driving.
- Tools/workflows: Experimental protocols linking polymer rheology to effective ε, α; stability scanning around suspected critical points.
- Assumptions/dependencies: Clear mapping from physical relaxation spectra to power-law-like kernels; minimal confounds from geometry and noise.
- Minimal benchmark for ML forecasting and system identification
- Sector: software/ML, academia
- What: Use the non-Markovian logistic map as a standardized testbed to evaluate whether sequence models (RNNs, Transformers, SSMs) can learn memory-dominated chaos and recover fractional scaling.
- Tools/workflows: Open-source implementation of Eq. (1), data generators with tunable ε, α; evaluation suite comparing learned β to ground-truth β(α).
- Assumptions/dependencies: Faithful training protocols; sufficient sequence length; careful separation of interpolation vs extrapolation tests.
- Experimental workflow to infer memory class in lab oscillators
- Sector: physics/biophysics labs
- What: Protocol: (1) sweep driving/control, (2) compute λN locally, (3) fit β; if β ≠ 1/2, infer α and classify regime (summable vs non-summable memory).
- Tools/workflows: Jupyter notebooks; bifurcation and λN computation scripts; data QC steps to handle transients and noise.
- Assumptions/dependencies: Ability to vary control parameters and gather long time series; minimal stochastic forcing.
- Education and communication
- Sector: education
- What: Teaching modules demonstrating the breakdown of Feigenbaum universality under power-law memory; interactive visualizations of phase diagrams and Lyapunov scaling.
- Tools/workflows: Simulation notebooks; classroom-ready plots and exercises.
- Assumptions/dependencies: None significant; uses provided minimal model.
Long-Term Applications
The following applications require further validation, scaling, or development before widespread deployment.
- Memory-engineered anti-chaos controllers (fractional-order control)
- Sector: robotics, industrial automation
- What: Design controllers with tailored fractional memory (PI^λD^μ) to either keep α > 1 (preserve classical universality and predictable scaling) or deliberately induce memory-dominated regimes to suppress chaos.
- Tools/products: “Fractional Anti-Chaos Controller” toolchain integrated with MATLAB/Simulink/Julia.
- Assumptions/dependencies: Robust identification of effective α, ε from plant; fractional calculus hardware/software support; guarantees in high-dimensional settings.
- Early-warning indicators for systemic risk via universality-class shifts
- Sector: finance, power grids, climate policy
- What: Monitor β(α) and r_c(α) drift as indicators that systems are entering memory-dominated regimes where classical stability heuristics fail. Use as part of stress-testing dashboards.
- Tools/workflows: Real-time λN estimators, streaming fits of β; alarms when β deviates from 1/2.
- Assumptions/dependencies: High-quality, high-frequency data; robust real-time Lyapunov estimation under noise; validated mapping from observed β to actionable risk thresholds.
- Synthetic biology circuits with programmable distributed memory
- Sector: biotechnology/synthetic biology
- What: Engineer genetic feedback with distributed delays (e.g., via cascaded sequestration/degradation) to realize effective power-law memory, steering oscillations away from chaotic regimes.
- Tools/products: Design frameworks linking circuit topology to α; libraries of delay motifs.
- Assumptions/dependencies: Biochemical realization of approximate power-law kernels; control over burden/noise; in vivo validation.
- Non-Markovian reservoir engineering for quantum devices
- Sector: quantum technology
- What: Tailor environmental spectral densities to control memory and avoid (or harness) chaotic dynamics in open quantum systems; use β(α) as a diagnostic in semiclassical limits.
- Tools/products: Reservoir engineering protocols; simulation suites with Li_α-based characteristic equations.
- Assumptions/dependencies: Reliable mapping from physical baths to effective α; quantum-to-classical correspondence for diagnostics.
- Climate and geophysical models with memory-aware parameterizations
- Sector: climate science, hydrology, Earth system policy
- What: Incorporate non-Markovian parameterizations (e.g., land–ocean heat uptake, soil moisture memory) that can shift chaotic variability patterns; interpret observed deviations from classical scaling as signatures of long memory.
- Tools/workflows: Subgrid schemes with fractional kernels; detection modules for β in paleoclimate/modern records.
- Assumptions/dependencies: Computational tractability; adequate data for calibration; multi-scale coupling validation.
- Materials and glasses: diagnostic instrumentation add-ons
- Sector: rheology/materials
- What: Develop rheometer modes that probe near-instability dynamics, extract β to quantify memory-dominated regimes in glassy/viscoelastic systems; use to tune processing parameters.
- Tools/products: “β-meter” software/hardware add-on for commercial rheometers.
- Assumptions/dependencies: Transferability from discrete-map predictions to continuous-time flows; careful experimental protocols.
- Epidemiological planning with fractional-order models
- Sector: public health policy
- What: Use memory-aware (fractional) epidemic models to capture behavior-driven and immunity-memory effects; monitor β-like indicators to anticipate non-classical outbreak dynamics.
- Tools/workflows: Model-fitting pipelines with power-law memory; scenario analyses for intervention timing.
- Assumptions/dependencies: Strong datasets on behavioral feedback and immunity waning; rigorous validation against historical outbreaks.
- Autonomy co-design for latency and history effects
- Sector: automotive, UAVs, industrial robotics
- What: Quantify and cap effective memory exponents induced by sensor fusion and delayed actuation to maintain predictable stability margins; design co-optimization of hardware latency and software filtering.
- Tools/products: “Memory budget” analyzers in autonomy stacks.
- Assumptions/dependencies: Accurate end-to-end latency/memory characterization; alignment with safety standards.
- General-purpose non-Markovian bifurcation and system ID suite
- Sector: software/tools for science and engineering
- What: A software platform for bifurcation analysis with power-law memory kernels (discrete and continuous-time), including Lyapunov spectrum computation and β(α) estimation.
- Tools/products: Libraries for Python/Julia/Matlab; plugins for DynSys toolboxes.
- Assumptions/dependencies: Efficient algorithms for high-dimensional and infinite-memory approximations; user adoption and benchmarking.
- Standards and reporting guidelines for “memory exponent” in dynamical systems
- Sector: industry consortia, regulatory bodies
- What: Establish reporting of estimated α and β(α) alongside classical stability metrics for systems with feedback/delays (e.g., medical devices, grid components).
- Tools/workflows: Best-practice documents; certification checklists.
- Assumptions/dependencies: Cross-domain consensus; evidence base from case studies.
Notes on feasibility across items:
- The core dependency is the ability to estimate α (or β) reliably from data, which requires sufficiently long, relatively stationary time series and careful handling of noise and control-parameter scans.
- The paper’s results stem from a minimal 1D map with additive power-law memory; real systems may require model extensions (multivariate, continuous time, stochasticity, non-additive memory). Validation and adaptation are needed before policy-critical use.