
Golden Eagle Genetic Optimization (GEGO)

Updated 28 January 2026
  • GEGO is a hybrid metaheuristic that integrates physics-inspired attack-cruise dynamics with genetic operators to enhance search in high-dimensional spaces.
  • It maintains population diversity by embedding selection, crossover, and mutation into the standard Golden Eagle Optimization framework.
  • Benchmark results demonstrate GEGO’s improved performance in hyperparameter tuning and fog computing scheduling compared to traditional methods.

Golden Eagle Genetic Optimization (GEGO) is a hybrid metaheuristic optimization framework developed to address computationally expensive and high-dimensional search problems where classic derivative-based or single-mechanism metaheuristics often underperform. GEGO was explicitly designed for tasks such as hyperparameter tuning under resource constraints and discrete combinatorial scheduling in fog computing, integrating the physics-inspired population movement of Golden Eagle Optimization (GEO) with genetic algorithm (GA) operators—selection, crossover, and mutation. The approach aims to enforce sustained population diversity throughout the search, inhibit premature convergence typical of swarm-based algorithms, and provide robust exploration-exploitation dynamics with minimal computational overhead (Nazarians et al., 21 Jan 2026, Sirjani et al., 9 Sep 2025).

1. Theoretical Foundations and Hybrid Motivation

Golden Eagle Optimization (GEO) is a swarm-based metaheuristic algorithm inspired by the hunting strategies of golden eagles, alternating between exploration (“cruise”) and exploitation (“attack”) in the solution space. GEO demonstrates strong global search capability in continuous domains but often suffers from loss of diversity and premature convergence, especially in later iterations or under small population/resource constraints.

Genetic Algorithms (GA), in contrast, use explicit selection, crossover, and mutation to maintain genetic diversity and allow for long-range jumps within the search space, but can lack effective guidance toward promising regions, leading to inefficiency in purely high-dimensional, continuous, or rugged landscapes. Hybridizing GEO with embedded genetic operations addresses these drawbacks by periodically injecting diversity and global information exchange directly into the optimizing population, thus balancing exploitation and exploration more effectively within limited computational budgets (Nazarians et al., 21 Jan 2026).

2. Algorithmic Structure and Mathematical Formulation

GEGO maintains a population $X = \{x_i \in \mathbb{R}^D \mid i = 1,\ldots,N\}$ of $N$ individuals in a $D$-dimensional search space. The core GEGO iterative loop comprises the following components:

  1. Attack–Cruise Update (Standard GEO):
    • Each solution $x_i^t$ updates its position using two principal vectors:
      • Attack vector: $\mathbf{A}_i = X_f^* - x_i^t$, where $X_f^*$ is a randomly selected “prey” (another population member).
      • Cruise vector: $\mathbf{C}_i$ is orthogonal to $\mathbf{A}_i$, introducing lateral exploration.
    • Displacement step:

    $$\Delta x_i^t = r_1\,p_a(t)\,\frac{\mathbf{A}_i}{\|\mathbf{A}_i\|} + r_2\,p_c(t)\,\frac{\mathbf{C}_i}{\|\mathbf{C}_i\|}, \qquad r_1, r_2 \sim U(0,1)$$

    • Time-varying attack $p_a(t)$ and cruise $p_c(t)$ coefficients gradually modulate exploration versus exploitation.
  2. Periodic Genetic Phase (Embedded Operators):
    • Every $f_{gen}$ iterations, the current solutions are encoded as binary chromosomes.
    • Selection uses binary tournaments: pairs are drawn from the population, and the fitter individual of each pair becomes a parent.
    • Crossover (linear, rate $P_c$): offspring are created as $o = \alpha p^1 + (1-\alpha) p^2$, with $\alpha \sim U(0,1)$.
    • Mutation (rate $P_m$): each bit flips with probability $P_m$; bit-flips decode to perturbations $\pm\delta$ in continuous space.
    • Offspring replace parents only if they improve fitness.
    • Genetic operator frequency and rates are hyperparameters that allow explicit diversity regulation (Nazarians et al., 21 Jan 2026).
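
The loop above can be sketched in NumPy under some simplifications: a continuous representation is kept throughout (so the paper's binary encode/decode step is replaced by real-valued linear crossover plus Gaussian perturbation mutation), candidate moves are accepted greedily, and offspring replace the current worst individual. These are illustrative assumptions, not the papers' exact procedure:

```python
import numpy as np

def gego_minimize(f, bounds, N=20, T=100, f_gen=4, Pc=0.9, Pm=0.05,
                  pa=(0.5, 2.0), pc=(1.0, 0.5), seed=None):
    """Minimal GEGO sketch: attack-cruise updates with a periodic genetic
    phase. Continuous representation only; binary encoding is omitted."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T          # bounds: list of (lo, hi)
    D = lo.size
    X = rng.uniform(lo, hi, size=(N, D))
    fit = np.array([f(x) for x in X])
    for t in range(T):
        # linearly interpolated attack/cruise propensities p_a(t), p_c(t)
        pa_t = pa[0] + (pa[1] - pa[0]) * t / T
        pc_t = pc[0] + (pc[1] - pc[0]) * t / T
        for i in range(N):
            prey = X[rng.integers(N)]             # random "prey" member
            A = prey - X[i]                       # attack vector
            C = rng.standard_normal(D)
            if A.dot(A) > 1e-12:                  # cruise: orthogonal to A
                C -= C.dot(A) / A.dot(A) * A
            step = np.zeros(D)
            for v, p in ((A, pa_t), (C, pc_t)):   # normalized displacement
                n = np.linalg.norm(v)
                if n > 1e-12:
                    step += rng.random() * p * v / n
            cand = np.clip(X[i] + step, lo, hi)
            fc = f(cand)
            if fc < fit[i]:                       # greedy acceptance (assumed)
                X[i], fit[i] = cand, fc
        if (t + 1) % f_gen == 0:                  # periodic genetic phase
            for _ in range(N // 2):
                a, b = rng.integers(N, size=2)    # binary tournament x2
                p1 = a if fit[a] < fit[b] else b
                a, b = rng.integers(N, size=2)
                p2 = a if fit[a] < fit[b] else b
                if rng.random() < Pc:             # linear crossover
                    alpha = rng.random()
                    child = alpha * X[p1] + (1 - alpha) * X[p2]
                else:
                    child = X[p1].copy()
                mask = rng.random(D) < Pm         # mutation as +/- perturbation
                child[mask] += 0.1 * (hi - lo)[mask] * rng.standard_normal(mask.sum())
                child = np.clip(child, lo, hi)
                fc = f(child)
                worst = int(np.argmax(fit))       # replace only on improvement
                if fc < fit[worst]:
                    X[worst], fit[worst] = child, fc
    best = int(np.argmin(fit))
    return X[best], fit[best]
```

On a convex test function such as the sphere, the linear crossover of two good individuals lands between them, which is why the genetic phase doubles as an exploitation step in smooth regions while mutation supplies the long-range jumps.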

The pseudocode, initialization and update steps, and per-stage computational complexity are formalized to match the standard $O(N\cdot(D+F))$ cost per iteration, where $F$ denotes the cost of one fitness evaluation, ensuring practical compatibility with standalone GA and GEO (see table below).

Algorithm | Complexity per Iteration | Diversity Source
GEO | $O(N\cdot(D+F))$ | Stochastic movement
GA | $O(N\cdot(D+F))$ | Selection/crossover/mutation
GEGO | $O(N\cdot(D+F)) + O(\text{genetic})$ | Both (embedded genetic ops)

3. Parameterization, Computational Characteristics, and Implementation

GEGO retains low parameterization overhead with the following tunable hyperparameters:

  • $N$: population size (typically 20–50)
  • $T$: maximum iterations (100–1000)
  • $p_a^0$, $p_a^T$: initial and final attack propensity (e.g., 0.5 → 2)
  • $p_c^0$, $p_c^T$: initial and final cruise propensity (e.g., 1 → 0.5)
  • $f_{gen}$: genetic phase frequency (e.g., every 3–5 iterations)
  • $P_c$: crossover probability ($\approx 1$)
  • $P_m$: mutation probability (low, e.g., 0.001 per bit)

The embedding of genetic operators does not alter asymptotic computational complexity relative to GEO or GA alone, incurring only minor constant-factor overhead from encoding/decoding and bitwise genetic manipulations. In discrete combinatorial scheduling, such as task allocation in fog computing, GEGO (introduced as IGEO in (Sirjani et al., 9 Sep 2025)) adapts by using hybrid GEO/genetic moves to generate discrete assignments, exploiting mutation for exploration and crossover for exploitation contingent on the sign of each step vector component.
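
The sign-conditioned discrete move described above might look as follows. The function name, the uniform 50/50 gene choice during crossover, and the random-node mutation are illustrative assumptions; the source specifies only that negative step components trigger mutation and positive ones trigger crossover:

```python
import numpy as np

def igeo_discrete_move(assign, prey, other, step, rng):
    """Hypothetical sketch of IGEO's discrete move for task-to-node
    assignment. `assign`, `prey`, `other` are integer node indices per
    task; `step` is the continuous attack-cruise displacement per task."""
    assign = assign.copy()
    n_nodes = int(max(assign.max(), prey.max(), other.max())) + 1
    for j, s in enumerate(step):
        if s < 0:
            # negative component -> mutation: explore a random node
            assign[j] = rng.integers(n_nodes)
        else:
            # non-negative component -> crossover: exploit existing genes
            assign[j] = prey[j] if rng.random() < 0.5 else other[j]
    return assign
```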

4. Empirical Performance on Benchmarks and Real-World Problems

Comprehensive benchmarking on the CEC2017 suite and practical application to neural network hyperparameter tuning and fog computing scheduling tasks demonstrate that GEGO consistently outperforms both constituent algorithms in robustness and solution quality.

CEC2017 Suite Results (Selected):

  • On 30D unimodal/multimodal benchmarks (N=20, T=100), GEGO achieves lower or comparable mean and standard deviation of best-found objective values than GEO and GA (e.g., EggHolder: GEGO $-936.23\pm31.63$ vs. GEO $-927.90\pm62.46$, GA $-687.88\pm121.62$).
  • On 100D composite functions (N=50, T=1000), GEGO outperforms GEO on 8 of 10 functions and consistently surpasses GA (e.g., CF3: GEGO $2797.56\pm388.29$, GEO $3248.38\pm1043.47$, GA $4577.79\pm655.39$) (Nazarians et al., 21 Jan 2026).

Hyperparameter Tuning (MNIST, D=10):

  • Under N=10, T=15, GEGO yields the highest mean test accuracy ($97.62\%\pm0.16$) and the lowest loss (0.06575), surpassing both GEO and GA across all ten trials.
  • Standard deviation is higher for GEGO, reflecting greater exploration and an ability to escape local optima where standalone GEO stagnates.

Fog Computing Task Scheduling:

  • Genetic-discretized GEO (IGEO) improves convergence by 15–20% over pure GEO in discrete assignment.
  • In 600-task scheduling on 20 heterogeneous fog nodes, IGEO reduces total energy consumption ($\sim$6.3 kJ vs. ETFC $\sim$6.9 kJ) and deadline violation time (IGEO $\sim$750 ms, ETFC $\sim$950 ms), with further reductions under the RL-augmented RIGEO extension (energy $\sim$6.0 kJ, deadline violation $\sim$600 ms) (Sirjani et al., 9 Sep 2025).

5. Exploration–Exploitation Balance and Diversity Maintenance

Periodic embedding of genetic crossover and mutation enables GEGO to sustain high population diversity. Crossover acts as a global information exchange mechanism, while mutation provides stochastic exploration, serving as an effective stagnation breaker. Empirical evidence includes the observed avoidance of stagnation in multimodal functions and broader exploration in hyperparameter configurations for neural networks. A plausible implication is that the higher standard deviation in GEGO optimization outcomes, relative to GEO, signifies more effective exploration of the search space and an increased probability of identifying better global optima (Nazarians et al., 21 Jan 2026).
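
Diversity claims of this kind can be tracked during a run with a simple population metric; mean pairwise Euclidean distance is one common choice (the papers may use a different measure):

```python
import numpy as np

def population_diversity(X):
    """Mean pairwise Euclidean distance over a population X of shape (N, D).
    A collapse toward zero signals the loss of diversity that the genetic
    phase is meant to counteract."""
    diff = X[:, None, :] - X[None, :, :]      # all pairwise differences
    d = np.sqrt((diff ** 2).sum(-1))          # (N, N) distance matrix
    n = len(X)
    return d.sum() / (n * (n - 1))            # average over ordered pairs
```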

6. Extensions: Discrete Domains and Reinforcement Learning Integration

The discrete adaptation, denoted GEGO (Discrete) or IGEO, targets combinatorial scheduling, where the genetic operators are conditioned on the attack–cruise update directionality: negative step components trigger mutation for exploration, while positive ones prefer crossover for exploitation.

The Reinforcement Improved GEO (RIGEO) extension integrates reinforcement learning (RL) for situations where rapid task deadline satisfaction is mandatory. RIGEO classifies fog nodes by traffic; low-traffic nodes utilize IGEO, and high-traffic, high-criticality tasks are scheduled by a Q-table-based RL agent, resulting in further improvements in system response time and energy under mixed workload conditions (Sirjani et al., 9 Sep 2025).
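
The Q-table-based agent presumably follows the standard tabular Q-learning update; the state/action encoding for fog-node scheduling is not specified by the source, so only the generic rule is sketched here:

```python
import numpy as np

def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """Standard tabular Q-learning update, applied in place.
    Q has shape (n_states, n_actions); state/action indices encode
    (traffic class, task criticality) -> node choices in RIGEO's setting
    (the exact encoding is an assumption)."""
    td_target = reward + gamma * Q[next_state].max()
    Q[state, action] += alpha * (td_target - Q[state, action])
    return Q
```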

7. Limitations and Prospects

GEGO demonstrates robust performance across mixed continuous/discrete optimization scenarios and constrained-resource regimes, but shows mixed comparative results against more adaptive differential-evolution variants such as L-SHADE, particularly on very large composite fitness landscapes.

Future research directions include large-scale statistical studies with formal Wilcoxon/Friedman significance testing, adaptation to deeper model architectures and larger datasets (e.g., CIFAR, ImageNet), and dynamic self-tuning of genetic operator frequencies and rates, as well as further investigation into hybrid metaheuristic–RL systems for distributed optimization in edge and fog computing (Nazarians et al., 21 Jan 2026, Sirjani et al., 9 Sep 2025).
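
The Wilcoxon signed-rank testing proposed above compares paired per-run results of two optimizers; with SciPy it is a one-liner. The numbers below are illustrative only, not taken from the cited papers:

```python
import numpy as np
from scipy.stats import wilcoxon

# Paired best-objective values over five repeated runs (illustrative data).
gego = np.array([2797.0, 2810.5, 2765.2, 2850.1, 2780.0])
geo = np.array([3248.0, 3190.7, 3301.4, 3150.9, 3275.3])

# Paired, non-parametric signed-rank test (exact p-value for small n).
stat, p = wilcoxon(gego, geo)
print(f"W = {stat}, p = {p:.4f}")
```

With so few runs the smallest attainable two-sided p-value is limited (here $2/2^5 = 0.0625$), which is precisely why the papers call for larger-scale statistical studies.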
