Complex-Valued Activation Functions

Updated 18 November 2025
  • Complex-valued activation functions (CVAFs) extend real-valued activations to the complex domain, balancing analyticity, boundedness, and phase dynamics.
  • They are categorized into holomorphic, split (Cartesian), and magnitude–phase families, each offering unique trade-offs between numerical stability and phase preservation.
  • CVAFs empower complex-valued neural networks in applications like signal processing, communications, medical imaging, and quantum information by enhancing model expressivity and performance.

Complex-valued activation functions (CVAFs) generalize the fundamental concept of nonlinearities from real-valued neural networks to architectures operating on complex numbers. They are central to the design and performance of complex-valued neural networks (CVNNs), which arise naturally in domains such as signal processing, communications, medical imaging, and quantum information, where the input, weights, or target functions are intrinsically complex. The extension from real to complex-valued activations presents unique mathematical and engineering challenges, particularly regarding analyticity, boundedness, and phase handling, compelling a distinct taxonomy of CVAFs and rigorous investigation of their theoretical properties and practical efficacy.

1. Mathematical Foundations and Taxonomy

The principal distinction in designing CVAFs is the interplay between analyticity (holomorphicity), boundedness, and the behavior with respect to phase and magnitude. Liouville's theorem prohibits any nonconstant, bounded, entire (holomorphic) nonlinearity on $\mathbb{C}$, necessitating trade-offs for nontrivial activation design (Abdalla, 2023; Hammad, 27 Jul 2024). This has led to three major classes of CVAFs:

  1. Holomorphic (Fully Complex) Activations: Functions analytic in $z$, such as $f(z) = \tanh(z)$, $f(z) = 1/(1+e^{-z})$, polynomials, exponentials, and other entire transcendental functions. They admit gradients $\partial f/\partial z$ without Wirtinger cross-terms, facilitating analytic backpropagation. However, they are necessarily unbounded or feature poles/branch cuts, complicating numerical stability (Corte et al., 2014; Abdalla, 2023; Voigtlaender, 2020; Hammad, 27 Jul 2024).
  2. Split Real–Imaginary Activations (Cartesian or Type-A): Extend real nonlinearities separately to $\Re(z)$ and $\Im(z)$, e.g.,

$$f(z) = \varphi(\Re z) + i\,\varphi(\Im z)$$

for some real activation $\varphi$. These are bounded (if $\varphi$ is) and directly compatible with real-valued frameworks, but they fail the Cauchy–Riemann conditions, breaking analyticity (Abdalla, 2023; Bassey et al., 2021; Hammad, 27 Jul 2024; Barrachina et al., 2023). They are simple to implement and robust, but phase information is not preserved, and coupling between the real and imaginary components is neglected.

  3. Magnitude–Phase (Polar or Type-B) Activations: Functions depending on the modulus $|z|$ and potentially modulating the phase, often of the form

$$f(z) = \rho(|z|)\, e^{i\phi(z)}$$

with $\rho, \phi$ real-valued functions. This class includes modReLU, cardioid, amplitude–phase (AP) families, and related constructs. Phase can be preserved, allowing strong inductive biases in wave and frequency-domain problems, at the cost of analyticity (Abdalla, 2023; Virtue et al., 2017; Bassey et al., 2021; Hammad, 27 Jul 2024).

The following table organizes representative CVAFs by family, illustrating key distinguishing features:

Family | Example Formula | Analyticity
--- | --- | ---
Holomorphic | $f(z)=\tanh(z)$, $f(z)=1/(1+e^{-z})$ | holomorphic
Split (Cartesian) | $f(z)=\varphi(\Re z) + i\,\varphi(\Im z)$ | nonholomorphic
Magnitude–Phase | $f(z)=\mathrm{ReLU}(\lvert z\rvert+b)\,z/\lvert z\rvert$, $f(z)=c(\lvert z\rvert, \arg z)$ | nonholomorphic
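
To make this taxonomy concrete, here is a minimal NumPy sketch with one representative per family. It is illustrative only: the function names, the modReLU bias value, and the small `eps` guard against division by zero at the origin are our own choices, not a canonical implementation.

```python
import numpy as np

def holomorphic_tanh(z: np.ndarray) -> np.ndarray:
    """Holomorphic (fully complex) tanh: complex-analytic, but unbounded,
    with poles at z = i*pi/2 + i*pi*k."""
    return np.tanh(z)

def split_relu(z: np.ndarray) -> np.ndarray:
    """Split (Cartesian, Type-A): real ReLU applied to each channel.
    Simple and stable, but not holomorphic and not phase-preserving."""
    return np.maximum(z.real, 0.0) + 1j * np.maximum(z.imag, 0.0)

def mod_relu(z: np.ndarray, b: float = -0.5, eps: float = 1e-9) -> np.ndarray:
    """Magnitude-phase (Type-B) modReLU: gates the modulus and keeps the
    phase exactly, since the gain multiplying z is real and nonnegative."""
    mag = np.abs(z)
    return np.maximum(mag + b, 0.0) * z / (mag + eps)

z = np.array([1.0 + 1.0j, -2.0 + 0.5j, 0.1 - 0.1j])
for f in (holomorphic_tanh, split_relu, mod_relu):
    print(f.__name__, f(z))
```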

2. Classical and Novel CVAF Constructions

Holomorphic Families

Classic holomorphic activations include $\tanh(z)$, $\sigma(z)=1/(1+e^{-z})$, and polynomials $P(z)$. These retain the full Cauchy–Riemann structure and are critical when true complex-differentiable behavior is needed, e.g., for Newton-type backpropagation, but their unboundedness or poles result in numerical instability and potential blow-up for large $|z|$ (Corte et al., 2014; Hammad, 27 Jul 2024). Polynomial Taylor truncations have been shown to accelerate Newton methods by improving Hessian conditioning and avoiding poles, as illustrated in the sketch below (Corte et al., 2014). Entire functions, such as exponentials and sigmoids, are used sparingly because of their global behavior and instability at large moduli (Abdalla, 2023; Hammad, 27 Jul 2024).
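
As a small illustration of the pole problem and the Taylor-truncation remedy, the snippet below compares $\tanh$ near its first pole at $i\pi/2$ with a fifth-order truncation; the truncation order and the test point are arbitrary choices for demonstration, not taken from the cited work.

```python
import numpy as np

def tanh_taylor5(z: np.ndarray) -> np.ndarray:
    """Fifth-order Taylor truncation of tanh about 0:
    tanh(z) ~ z - z**3/3 + 2*z**5/15.
    The truncation is entire (pole-free), so it cannot blow up at any
    finite |z|, though it still grows polynomially for large inputs."""
    return z - z**3 / 3.0 + 2.0 * z**5 / 15.0

# tanh has poles at z = i*pi/2 + i*pi*k; evaluating just below the first
# pole shows the blow-up that the truncation avoids.
z_near_pole = 1j * (np.pi / 2 - 1e-6)
print(abs(np.tanh(z_near_pole)))       # enormous magnitude near the pole
print(abs(tanh_taylor5(z_near_pole)))  # modest, pole-free value
```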

Split and Cartesian Functions

Split functions apply classical real nonlinearities to each channel:

  • Split-ReLU: $f(z)=\max(0, \Re z) + i\,\max(0, \Im z)$
  • Split-Tanh: $f(z)=\tanh(\Re z) + i\,\tanh(\Im z)$

Empirical studies confirm the practicality and stability of split-ReLU/tanh for generic tasks and show that, due to the lack of rotational equivariance, they may underperform on phase-sensitive tasks (Barrachina et al., 2023; Bassey et al., 2021; Abdalla, 2023; Hammad, 27 Jul 2024). Best practices suggest employing split-tanh or split-ELU for generic signal processing and image applications where phase invariance is not critical (Hammad, 27 Jul 2024).

Magnitude–Phase and Phase-Preserving Constructions

Nonholomorphic, phase-aware CVAFs dominate the recent literature due to their favorable inductive bias for complex signal domains:

  • modReLU: $f(z) = \mathrm{ReLU}(|z|+b)\,\frac{z}{|z|}$ (phase-preserving, magnitude-gating; widely used in unitary RNNs and stable for deep nets) (Caragea et al., 2021; Abdalla, 2023; Bassey et al., 2021). It is continuous, piecewise smooth, and 1-Lipschitz, but non-differentiable on the gating circle $|z| = -b$ (for negative bias $b$) and not holomorphic.
  • Cardioid: $f(z) = \frac{1+\cos(\arg z)}{2}\,z$; gates by the phase $\arg z$, recovers ReLU on the real axis, and interpolates the phase smoothly; shown to outperform split and magnitude gating on MRI fingerprinting (Virtue et al., 2017; Abdalla, 2023; Bassey et al., 2021).
  • zReLU: $f(z)=z$ if $\Re(z)>0$ and $\Im(z)>0$, $0$ otherwise; quadrant gating, non-smooth (Trabelsi et al.).
  • Reciprocal–polynomial and polynomial–reciprocal families: $Z_o^{(P)}(z)=\alpha z + \sum_n k_n z^n/(\lvert z\rvert^q+\epsilon)$, enabling parameter-efficient, phase-aware nonlinear shaping; phase is preserved because a real-valued gain is applied to $z$ (Young et al., 4 Apr 2025).

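Two of the constructions above, cardioid and zReLU, are short enough to sketch directly from their formulas; the function names below are our own choices.

```python
import numpy as np

def cardioid(z: np.ndarray) -> np.ndarray:
    """Cardioid: f(z) = (1 + cos(arg z))/2 * z. The gain is real and
    nonnegative, so the phase is preserved; on the real axis the gain is
    1 for positive inputs and 0 for negative ones, recovering ReLU."""
    return 0.5 * (1.0 + np.cos(np.angle(z))) * z

def z_relu(z: np.ndarray) -> np.ndarray:
    """zReLU: passes z only in the open first quadrant
    (Re z > 0 and Im z > 0), zero elsewhere."""
    gate = (z.real > 0) & (z.imag > 0)
    return np.where(gate, z, 0.0 + 0.0j)

z = np.array([1.0 + 1.0j, -1.0 + 1.0j, 2.0 - 0.5j])
print(cardioid(z))
print(z_relu(z))
```
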
Amplitude–phase and phase-sensitive activations are now diverse, including parametric forms such as modSoftplus, modSwish, and CAP-type piecewise/composite gates, with explicit parameterization of sharpness and shape (Hammad, 27 Jul 2024; Young et al., 4 Apr 2025). These forms enhance expressive power and parameter efficiency, as evidenced by significant improvements on benchmarks such as AudioMNIST with hybrid real–complex architectures (65% loss reduction, 54% parameter savings) (Young et al., 4 Apr 2025).

3. Nonparametric and Adaptive CVAFs

Kernel Activation Functions (KAFs) extend CVAF expressivity beyond fixed formulas, implementing nonparametric, neuron-wise complex function expansions (Scardapane et al., 2018; Scardapane et al., 2019).

  • KAF: $f(z)=\sum_n \alpha_n K(z, d_n)$ for a complex kernel $K$ and fixed dictionary $\{d_n\}$
  • Widely Linear KAFs (WL-KAFs): $f_{\mathrm{WL}}(z)=\sum_n \alpha_n K(z, d_n) + \alpha_n^* \widetilde{K}(z, d_n)$, capturing the full affine complex function closure.

KAFs, via positive-definite kernels on $\mathbb{C}$, approximate arbitrary smooth nonlinearities for each neuron and can outperform fixed activations (e.g., split-KAF: 97.2% accuracy vs. modReLU: 95.9% on complex-MNIST) (Scardapane et al., 2018). Widely linear extensions double the representational capacity without increasing the parameter count, providing superior accuracy and faster convergence (Scardapane et al., 2019).
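
A minimal sketch of the KAF idea follows; the Gaussian kernel, the 5x5 dictionary grid, and the kernel width `gamma` are illustrative assumptions rather than the exact choices of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed dictionary: a 5 x 5 grid of complex anchor points (a design choice;
# practical KAF implementations typically fix a regular grid over the
# expected input range).
grid = np.linspace(-2.0, 2.0, 5)
dictionary = (grid[:, None] + 1j * grid[None, :]).ravel()  # 25 anchors d_n

# Trainable mixing coefficients alpha_n, one complex value per anchor.
alpha = rng.standard_normal(dictionary.size) + 1j * rng.standard_normal(dictionary.size)

def gaussian_kernel(z, d, gamma=1.0):
    """A simple positive-definite choice: Gaussian of the complex distance."""
    return np.exp(-gamma * np.abs(z - d) ** 2)

def kaf(z: np.ndarray) -> np.ndarray:
    """Kernel activation f(z) = sum_n alpha_n K(z, d_n), applied pointwise."""
    K = gaussian_kernel(z[..., None], dictionary)  # shape (..., 25)
    return K @ alpha

print(kaf(np.array([0.3 + 0.7j, -1.2 + 0.1j])))
```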

4. Universal Approximation and Theoretical Guarantees

Universal approximation theory in the complex domain diverges qualitatively from the real case due to analyticity, polyharmonicity, and the richer function structure on $\mathbb{C}$ (Voigtlaender, 2020; Geuchen et al., 2023). Key results include:

  • Deep CVNNs (depth $\geq 2$) are universal if the activation is neither polynomial in $(z, \bar z)$, holomorphic, nor antiholomorphic. This admits a vastly broader class than in the real case, allowing almost any nonholomorphic, non-affine, nonlinear CVAF (Voigtlaender, 2020; Geuchen et al., 2023).
  • Shallow CVNNs (depth 1): universality requires that the real or imaginary part of the activation is not (almost) polyharmonic. For non-polyharmonic, smooth $\sigma$, shallow networks densely approximate all continuous functions.
  • modReLU and magnitude–phase activations satisfy universality for deep networks, with optimal approximation rates matching the real ReLU case under a doubling of the dimension: error $\epsilon$ is achieved with $O(\epsilon^{-2d/n}\log^2(1/\epsilon))$ weights and $O(\log(1/\epsilon))$ depth for $C^n$ targets on $[0,1]^{2d}$ domains (Caragea et al., 2021).
  • Kernel-based and adaptive neuron-wise activations also admit universal approximation via RKHS theory for suitable kernels (Scardapane et al., 2018).

Thus, almost all practical nonholomorphic, non-polynomial CVAFs support universal approximation in deep architectures; analytic activations are insufficient for full expressivity.

5. Complex Backpropagation and Implementation

The choice of CVAF is central to the training dynamics and gradient flow through Wirtinger calculus. For nonholomorphic $f = u + iv$ of $z = x + iy$, the relevant derivative is

$$\frac{\partial f}{\partial z} = \frac{1}{2}\left(\frac{\partial u}{\partial x} - i\,\frac{\partial u}{\partial y}\right) + \frac{i}{2}\left(\frac{\partial v}{\partial x} - i\,\frac{\partial v}{\partial y}\right)$$

with the corresponding derivative taken with respect to $\bar z$; for analytic activations the $\partial/\partial\bar z$ term vanishes (Abdalla, 2023; Hammad, 27 Jul 2024). Networks are trained via CR-calculus-based updates, with different backpropagation schemes for holomorphic and generic CVAFs (Abdalla, 2023; Barrachina et al., 2023).
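
These relations can be checked numerically with central finite differences; the step size and test points below are arbitrary demonstration choices.

```python
import numpy as np

def wirtinger(f, z: complex, h: float = 1e-6):
    """Central-difference estimates of the Wirtinger derivatives:
    df/dz    = (df/dx - i*df/dy)/2
    df/dzbar = (df/dx + i*df/dy)/2, with z = x + iy."""
    dfdx = (f(z + h) - f(z - h)) / (2 * h)
    dfdy = (f(z + 1j * h) - f(z - 1j * h)) / (2 * h)
    return (dfdx - 1j * dfdy) / 2, (dfdx + 1j * dfdy) / 2

# Holomorphic tanh: the d/dzbar term vanishes up to discretization error.
dz, dzbar = wirtinger(np.tanh, 0.3 + 0.4j)
print(dz, abs(dzbar))   # dzbar ~ 0

# Split-ReLU at a point with negative imaginary part acts as z -> Re(z),
# which is genuinely nonholomorphic: d/dzbar = 1/2 there.
split = lambda w: max(w.real, 0.0) + 1j * max(w.imag, 0.0)
dz, dzbar = wirtinger(split, 0.3 - 0.4j)
print(dz, dzbar)        # both equal 1/2
```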

Complex-valued libraries (e.g., the cvnn toolbox, TensorFlow, PyTorch 1.6+) now provide automatic Wirtinger differentiation and proper weight initialization (complex Xavier, with the variance split equally, i.e., scaled by $1/2$, between the real and imaginary parts) (Barrachina et al., 2023).
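
A sketch of the complex Xavier scheme just described, splitting the Glorot variance $2/(\mathrm{fan_{in}}+\mathrm{fan_{out}})$ equally between the two components (the Gaussian draw is one common choice; uniform variants also exist):

```python
import numpy as np

def complex_xavier(fan_in: int, fan_out: int, rng=None):
    """Complex Glorot/Xavier initialization: the target total variance
    2/(fan_in + fan_out) is split equally between the real and imaginary
    parts, so each part is drawn with variance 1/(fan_in + fan_out)."""
    rng = rng or np.random.default_rng()
    sigma = np.sqrt(1.0 / (fan_in + fan_out))
    w_re = rng.normal(0.0, sigma, size=(fan_out, fan_in))
    w_im = rng.normal(0.0, sigma, size=(fan_out, fan_in))
    return w_re + 1j * w_im

W = complex_xavier(128, 64)
print(W.shape, np.var(W.real) + np.var(W.imag))  # ~ 2/(128+64)
```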

6. Empirical Benchmarks and Applications

Empirical evaluations consistently demonstrate the superiority of phase-preserving and parameter-adaptive CVAFs on tasks with complex, oscillatory, or wave-type signals:

  • MRI fingerprinting: Cardioid and magnitude–phase CVAFs yield distinctly lower NRMSE for tissue property recovery, outperforming both real-valued and naive complex activations (Virtue et al., 2017).
  • Channel equalization, FFT-MNIST, and wind prediction: KAF and its widely linear variant achieve lower MSE, higher $R^2$, and improved classification accuracy relative to fixed nonlinearities (Scardapane et al., 2018; Scardapane et al., 2019).
  • Speech recognition (AudioMNIST) and hybrid architectures: Parameter-efficient polynomial–reciprocal phase-gated CVAFs provide up to 65% reduction in cross-entropy loss, especially in low-SNR regimes (Young et al., 4 Apr 2025).
  • Function approximation under data scarcity: Holomorphic activations (e.g., CauchyNet with $f(z)=1/z$) deliver compact models for rational/oscillatory signals, halving MAE relative to a ReLU MLP while minimizing parameter count (Zhang et al., 11 Oct 2025).

Where phase information is task-critical, such as in communications, medical imaging, or any frequency-domain modeling, phase-preserving and magnitude–phase CVAFs are empirically favored. Split and holomorphic activations remain competitive in generic domains or where stability, simplicity, or analytic gradients dictate.

7. Open Directions, Limitations, and Practical Recommendations

Several challenges persist:

  • Bounded holomorphic CVAFs are impossible by Liouville's theorem; practical implementations of analytic activations require radius control or domain restriction (Corte et al., 2014; Hammad, 27 Jul 2024).
  • Nonholomorphic but smooth, phase-aware CVAFs (magnitude–phase gates, polynomial–reciprocal, and CAP-family forms) balance expressivity, stability, and universal approximation, and are well suited to signal- and waveform-related domains (Young et al., 4 Apr 2025; Abdalla, 2023; Hammad, 27 Jul 2024).
  • Nonparametric and adaptive neuron-wise CVAFs (KAF/WL-KAF) offer maximal flexibility at the cost of per-neuron parameter scaling; optimal kernel choice and dictionary grid design remain active research problems (Scardapane et al., 2018; Scardapane et al., 2019).
  • Vanishing or exploding gradients can arise at zero magnitude (phase division) or at poles (entire/holomorphic activations); practical implementations must regularize, offset denominators, or penalize imaginary output components as needed, as sketched below (Zhang et al., 11 Oct 2025; Barrachina et al., 2023).
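
The denominator-offset and imaginary-penalty remedies from the last bullet can be written down directly; `eps` and the penalty weight are tunable assumptions, not values from the cited papers.

```python
import numpy as np

def stabilized_inverse(z: np.ndarray, eps: float = 1e-3) -> np.ndarray:
    """Pole-safe surrogate for the holomorphic f(z) = 1/z: equals
    conj(z)/(|z|**2 + eps), which matches 1/z away from the origin but
    keeps values and gradients bounded near z = 0, at the cost of exact
    analyticity."""
    return np.conj(z) / (np.abs(z) ** 2 + eps)

def imaginary_penalty(outputs: np.ndarray, weight: float = 1e-2) -> float:
    """Regularizer discouraging imaginary components in the network output
    when the regression target is real-valued."""
    return weight * float(np.mean(outputs.imag ** 2))
```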

Best practices are now reasonably well established: split activations (split-tanh, split-ELU) for generic tasks where phase handling is not critical; phase-preserving magnitude–phase gates for communications, imaging, and other frequency-domain problems; and adaptive KAF-style activations where per-neuron flexibility justifies the added parameters.

Emerging research investigates locally bounded holomorphic activations, adaptive phase–amplitude transforms, and extensions into hypercomplex (quaternionic) neural architectures (Hammad, 27 Jul 2024).

