
Hybrid HHO-BA Metaheuristic Feature Selection

Updated 1 January 2026
  • The paper demonstrates the integration of HHO and BA algorithms within the SRFA framework to achieve state-of-the-art feature selection performance in multimodal CT imaging.
  • Hybrid HHO-BA metaheuristic feature selection is a strategy that combines global exploration and local refinement using nature-inspired optimization techniques in deep learning.
  • The method’s efficacy is validated by high accuracy (96.23%), F1-score (95.58%), and specificity (94.83%) for early pancreatic neoplasm detection.

Hybrid HHO-BA metaheuristic feature selection is an advanced feature selection strategy that synthesizes Harris Hawks Optimization (HHO) and the Bat Algorithm (BA) within a deep learning pipeline for robust, compact, and highly discriminative feature subset identification. This approach, as implemented in the Scalable Residual Feature Aggregation (SRFA) framework, has demonstrated state-of-the-art performance in multimodal CT imaging for early pancreatic neoplasm detection, outperforming contemporary CNN and transformer baselines (Thiruvengadam et al., 29 Dec 2025).

1. Foundation: Metaheuristic Feature Selection in Deep Learning Pipelines

Feature selection is critical in complex, high-dimensional tasks to reduce overfitting, improve generalization, and enhance computational efficiency. In the SRFA framework, feature selection operates after hierarchical feature extraction and aggregation, where a dense network of residual connections produces high-dimensional composite representations. The objective is to identify a binary mask $s \in \{0,1\}^D$ maximizing a fitness objective:

$$J(s) = \mathrm{Acc}(s) - \lambda \|s\|_1, \quad \lambda > 0$$

where $\mathrm{Acc}(s)$ is cross-validated classification accuracy using only the features selected by $s$, and $\|s\|_1$ penalizes the number of selected features (Thiruvengadam et al., 29 Dec 2025). This formulation directly addresses the trade-off between predictive performance and feature compactness.
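As a minimal sketch, the objective can be written as a plain Python function. The `accuracy_fn` callable and the toy scoring rule below are hypothetical stand-ins for cross-validated accuracy, not part of the SRFA implementation:

```python
import numpy as np

def fitness(mask, accuracy_fn, lam=0.01):
    """J(s) = Acc(s) - lambda * ||s||_1 for a binary feature mask.

    `accuracy_fn` is assumed to return cross-validated accuracy using
    only the features where mask == 1 (hypothetical callable).
    """
    mask = np.asarray(mask)
    if mask.sum() == 0:          # empty subset: no usable features
        return -np.inf
    return accuracy_fn(mask) - lam * mask.sum()

# Toy stand-in: accuracy grows with an (imaginary) informative prefix.
toy_acc = lambda m: 0.8 + 0.1 * m[:4].mean()

sparse = fitness(np.array([1, 1, 1, 1, 0, 0, 0, 0]), toy_acc)
dense = fitness(np.ones(8, dtype=int), toy_acc)
```

With the toy accuracy, the sparse mask scores higher than the all-ones mask, illustrating how the $\ell_1$ penalty rewards compactness at equal accuracy.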

2. Harris Hawks Optimization: Coordinated Exploration and Exploitation

HHO is a population-based metaheuristic inspired by the cooperative hunting behaviors and attack strategies of Harris hawks. The population represents solutions (feature masks) $x_i \in \mathbb{R}^D$, and search dynamics are controlled by a simulated prey escape energy:

$$E(t) = 2 E_0 (1 - t/T), \quad E_0 \sim \mathcal{U}(-1,1)$$

For high $|E|$ (energetic prey), hawks explore via the position update:

$$x_i(t+1) = x_r(t) - r_1 \, | x_r(t) - 2 r_2 \, x_i(t) |, \quad r_1, r_2 \sim \mathcal{U}(0,1)$$

where $x_r$ is a randomly selected hawk. For low $|E|$ (fatigued prey), exploitation mechanisms adapt the population towards the current best solution (the "prey"):

$$x_i(t+1) = x_\text{prey}(t) - E \, | J \, x_\text{prey}(t) - x_i(t) |, \quad J \sim \mathcal{U}(0,2)$$

This stochastic, non-Gaussian search pattern enables efficient traversal of complex, multimodal spaces typical of biomedical feature selection (Thiruvengadam et al., 29 Dec 2025).
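A simplified sketch of one HHO sweep, assuming continuous positions that a transfer function would later binarize into masks; the population size, dimensionality, and RNG seeding below are illustrative choices, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def hho_step(X, x_prey, t, T):
    """One simplified HHO sweep: exploration when |E| >= 1,
    soft-besiege exploitation otherwise."""
    E0 = rng.uniform(-1, 1)
    E = 2 * E0 * (1 - t / T)           # escape energy decays over time
    X_new = np.empty_like(X)
    for i in range(len(X)):
        if abs(E) >= 1:                # exploration: perch near a random hawk
            xr = X[rng.integers(len(X))]
            r1, r2 = rng.uniform(0, 1, 2)
            X_new[i] = xr - r1 * np.abs(xr - 2 * r2 * X[i])
        else:                          # exploitation: besiege the prey
            J = rng.uniform(0, 2)
            X_new[i] = x_prey - E * np.abs(J * x_prey - X[i])
    return X_new

X = rng.uniform(-1, 1, size=(10, 5))   # 10 hawks, 5-dim toy search space
X1 = hho_step(X, X.mean(axis=0), t=1, T=50)
```

As $t \to T$, $|E|$ shrinks, so the sweep naturally shifts from the exploratory branch to the besiege branch.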

3. Bat Algorithm: Intensified Solution Refinement

The Bat Algorithm is inspired by microbat echolocation. Each bat holds a position $x_i$, velocity $v_i$, echolocation frequency $f_i$, loudness $A_i$, and pulse emission rate $r_i$:

  • Frequency: $f_i = f_\text{min} + (f_\text{max} - f_\text{min})\,\beta$, with $\beta \sim \mathcal{U}(0,1)$
  • Velocity: $v_i(t) = v_i(t-1) + \big(x_i(t-1) - x_\text{best}(t-1)\big) f_i$
  • Position: $x_i(t) = x_i(t-1) + v_i(t)$
  • Loudness and pulse rate: $A_i(t+1) = \alpha A_i(t)$, $r_i(t+1) = r_{i0}\,[1 - \exp(-\gamma t)]$, with $\alpha \in (0,1)$, $\gamma > 0$

BA efficiently fine-tunes solutions near promising regions, rapidly refining the local optima identified by the preceding global search phase (Thiruvengadam et al., 29 Dec 2025).
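The update rules above can be sketched directly; the frequency bounds, $\alpha$, and $\gamma$ values below are illustrative defaults, not the paper's tuned settings:

```python
import numpy as np

rng = np.random.default_rng(1)

def ba_step(x, v, x_best, f_min=0.0, f_max=2.0):
    """One Bat Algorithm motion update: frequency, velocity, position."""
    beta = rng.uniform(0, 1, size=len(x))
    f = f_min + (f_max - f_min) * beta          # per-dimension frequency
    v_new = v + (x - x_best) * f                # velocity update toward best
    return x + v_new, v_new

def decay(A, r0, t, alpha=0.9, gamma=0.9):
    """Loudness A and pulse-rate r schedules."""
    return alpha * A, r0 * (1 - np.exp(-gamma * t))

x, v = np.zeros(4), np.zeros(4)
x_new, v_new = ba_step(x, v, x_best=np.ones(4))
A1, r1 = decay(A=1.0, r0=0.5, t=1)
```

Note that loudness decays geometrically while the pulse rate rises toward $r_{i0}$, which in the full algorithm gradually shifts bats from random local walks to accepting refined solutions.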

4. HHO-BA Hybridization: Workflow and Algorithmic Structure

The hybridization sequence exploits HHO's global search capacity and BA's local refinement power:

  1. Initialization: Randomly sample binary masks $x_i$ and assign each an initial fitness $J(x_i)$.
  2. HHO optimization: Over $T_\text{hho}$ iterations, alternate HHO exploration and exploitation, updating the global best $x_\text{prey}$ at each sweep.
  3. Candidate selection: Retain the top $M$ mask solutions from HHO as seeds.
  4. BA refinement: Run BA on these $M$ seeds for $T_\text{ba}$ iterations, updating positions via frequency- and velocity-based motion, accepting candidates of higher fitness and probabilistically exploiting high-loudness regions.
  5. Solution output: Return the mask $s^*$ that maximizes $J(s)$ (Thiruvengadam et al., 29 Dec 2025).

This hybrid workflow ensures both diversity in the solution pool and precision in the final selection, outperforming single-method strategies.
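A compact, illustrative skeleton of the two-phase workflow, using a hypothetical toy fitness and a sigmoid-threshold binarization; only the soft-besiege HHO move is kept, so this is a simplification of the full scheme rather than the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def toy_fitness(mask, lam=0.02):
    """Stand-in J(s): rewards the first half of the features (hypothetical)."""
    D = len(mask)
    acc = 0.7 + 0.25 * mask[: D // 2].mean() - 0.1 * mask[D // 2 :].mean()
    return acc - lam * mask.sum()

def binarize(x):
    return (1 / (1 + np.exp(-x)) > 0.5).astype(int)   # sigmoid transfer

def hho_ba_select(D=10, n=20, T_hho=30, M=5, T_ba=20):
    X = rng.uniform(-1, 1, (n, D))                    # 1. initialization
    # 2. HHO global search (soft besiege only, for brevity)
    for t in range(T_hho):
        fits = np.array([toy_fitness(binarize(x)) for x in X])
        prey = X[fits.argmax()]
        E = 2 * rng.uniform(-1, 1) * (1 - t / T_hho)
        J = rng.uniform(0, 2, X.shape)
        X = prey - E * np.abs(J * prey - X)
    # 3. keep the top-M HHO solutions as BA seeds
    fits = np.array([toy_fitness(binarize(x)) for x in X])
    seeds = X[np.argsort(fits)[-M:]]
    V = np.zeros_like(seeds)
    # 4. BA refinement with greedy acceptance
    for t in range(T_ba):
        fits = np.array([toy_fitness(binarize(x)) for x in seeds])
        best = seeds[fits.argmax()]
        f = rng.uniform(0, 2, seeds.shape)
        V = V + (seeds - best) * f
        cand = seeds + V
        better = np.array([toy_fitness(binarize(c)) for c in cand]) > fits
        seeds[better] = cand[better]
    # 5. output the best mask
    fits = np.array([toy_fitness(binarize(x)) for x in seeds])
    return binarize(seeds[fits.argmax()])

mask = hho_ba_select()
```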

5. Integration with Residual Feature Aggregation and Deep Architectures

In SRFA, features are aggregated using DenseNet-121 with Residual Feature Stores (RFS), producing high-dimensional representations $F_\text{agg}$ by channel-wise concatenation across selected DenseNet blocks, normalized using a 1×1 convolution and BatchNorm:

$$\hat{F} = \mathrm{BN}\big(W_{1\times1} * \mathrm{Concat}(f_{\ell_1}, \ldots, f_{\ell_k})\big)$$

The hybrid HHO-BA feature selection operates on the flattened aggregated vector $\mathbf{x} \in \mathbb{R}^D$ produced by this step, prior to classification by a fused Vision Transformer (ViT) and EfficientNet-B3 module (Thiruvengadam et al., 29 Dec 2025).
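To make the aggregation step concrete, the NumPy sketch below treats the 1×1 convolution as a per-pixel linear map over concatenated channels and applies batch normalization with unit scale and zero shift; the shapes and weights are illustrative, not those of DenseNet-121:

```python
import numpy as np

def aggregate(features, W, eps=1e-5):
    """Channel-wise concat of block outputs, then a 1x1 conv
    (per-pixel linear map over channels) and batch normalization.

    features: list of (N, C_i, H, W) arrays; W: (C_out, sum C_i)."""
    F = np.concatenate(features, axis=1)             # (N, C_total, H, W)
    F_conv = np.einsum('oc,nchw->nohw', W, F)        # 1x1 convolution
    mu = F_conv.mean(axis=(0, 2, 3), keepdims=True)  # per-channel stats
    var = F_conv.var(axis=(0, 2, 3), keepdims=True)
    return (F_conv - mu) / np.sqrt(var + eps)        # BN (gamma=1, beta=0)

rng = np.random.default_rng(3)
f1 = rng.normal(size=(2, 3, 4, 4))   # two toy "DenseNet block" outputs
f2 = rng.normal(size=(2, 5, 4, 4))
W = rng.normal(size=(6, 8))          # project 3+5 channels down to 6
F_hat = aggregate([f1, f2], W)
```

Flattening `F_hat` per sample then yields the vector $\mathbf{x} \in \mathbb{R}^D$ on which the mask search operates.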

6. Empirical Performance and Validation

On a multimodal pancreatic CT dataset, the SRFA pipeline incorporating hybrid HHO-BA feature selection attained:

  • Accuracy: $96.23\% \pm 0.42\%$
  • F1-score: $95.58\% \pm 0.35\%$
  • Specificity: $94.83\% \pm 0.50\%$

Five-fold cross-validation was employed; each metric is reported as a mean with a $95\%$ confidence interval, demonstrating superiority over baseline CNN and transformer architectures (Thiruvengadam et al., 29 Dec 2025).

7. Significance, Scalability, and Prospects

Hybrid HHO-BA metaheuristic feature selection represents a modular and generalizable approach to high-dimensional, nonlinear feature selection in medical AI contexts. Its competitive advantage stems from integrating global exploration and local exploitation, leading to compact yet highly informative feature sets that can be seamlessly fused with attention-driven and convolutional architectures. The associated SRFA framework supports extensions to volumetric data and radiomics, with scalable residual aggregation mechanisms and dual metaheuristic hyperparameter tuning (SSA and GWO) providing adaptability and enhanced robustness for diverse biomedical imaging tasks (Thiruvengadam et al., 29 Dec 2025).
