
Firefly Algorithm: Recent Advances and Applications (1308.3898v1)

Published 18 Aug 2013 in math.OC and cs.AI

Abstract: Nature-inspired metaheuristic algorithms, especially those based on swarm intelligence, have attracted much attention in the last ten years. The firefly algorithm appeared about five years ago, and its literature has expanded dramatically with diverse applications. In this paper, we will briefly review the fundamentals of the firefly algorithm together with a selection of recent publications. Then, we discuss the optimality associated with balancing exploration and exploitation, which is essential for all metaheuristic algorithms. By comparing with the intermittent search strategy, we conclude that metaheuristics such as the firefly algorithm are better than the optimal intermittent search strategy. We also analyse these algorithms and their implications for higher-dimensional optimization problems.

Citations (913)

Summary

  • The paper presents a comprehensive review of the Firefly Algorithm’s theoretical framework and dynamic parameter control for enhanced optimization.
  • The study showcases FA’s versatility, applying it to digital image processing, engineering design, scheduling, and neural network training.
  • The research highlights FA's efficiency, demonstrating that it requires fewer function evaluations than GA and PSO, with promising scalability to high-dimensional problems.

Firefly Algorithm: Recent Advances and Applications

The paper "Firefly Algorithm: Recent Advances and Applications" authored by Xin-She Yang and Xingshi He provides an in-depth review of the Firefly Algorithm (FA), its theoretical underpinnings, applications, and comparisons with other metaheuristic approaches. Over the past decade, nature-inspired algorithms have been at the forefront of solving complex optimization problems, and the FA, introduced by Yang in 2007-2008, remains a prominent method within this domain.

Fundamentals of Firefly Algorithm

The Firefly Algorithm is an optimization technique based on the natural flashing behavior of fireflies. This behavior is primarily governed by three idealized rules:

  1. Unisex Attraction: All fireflies are unisex, so any firefly can be attracted to any other regardless of sex.
  2. Brightness and Distance: Attractiveness is proportional to brightness, and both decrease as the distance between two fireflies increases (perceived light intensity falls off roughly with the square of the distance); for any pair, the less bright firefly moves towards the brighter one.
  3. Objective Landscape: A firefly’s brightness is determined by the value of the objective function being optimized.

The attractiveness $\beta$ and the movement dynamics of the fireflies are defined mathematically. The key equation governing the movement of firefly $i$ attracted to a brighter firefly $j$ is:

$$x_i^{t+1} = x_i^t + \beta_0 e^{-\gamma r_{ij}^2}\,(x_j^t - x_i^t) + \alpha_t \varepsilon_i^t$$

where $\beta_0$ is the attractiveness at $r = 0$, $\gamma$ is the light absorption coefficient, $\alpha_t$ is the randomization parameter, and $\varepsilon_i^t$ is a stochastic vector.
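As a concrete illustration of this update, the following Python sketch implements a single move of firefly $i$ towards a brighter firefly $j$. The function name, the NumPy-based vector representation, and the Gaussian choice for $\varepsilon_i^t$ are assumptions made here for illustration rather than details taken from the paper.

```python
import numpy as np

def firefly_move(x_i, x_j, beta0=1.0, gamma=1.0, alpha_t=0.1, rng=None):
    """One move of firefly i towards a brighter firefly j.

    Implements x_i^{t+1} = x_i + beta0 * exp(-gamma * r_ij^2) * (x_j - x_i)
                           + alpha_t * eps_i,
    where eps_i is drawn here from a standard normal distribution
    (the paper only requires a stochastic vector).
    """
    rng = np.random.default_rng() if rng is None else rng
    r_sq = np.sum((x_i - x_j) ** 2)        # squared distance r_ij^2
    beta = beta0 * np.exp(-gamma * r_sq)   # attractiveness decays with distance
    eps = rng.standard_normal(x_i.shape)   # stochastic vector eps_i^t
    return x_i + beta * (x_j - x_i) + alpha_t * eps
```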

Parameter Settings and Complexity

The FA involves several parameters that are pivotal to its performance:

  • Randomization Parameter ($\alpha_t$): Typically, $\alpha_0$ is chosen relative to the problem’s scale, and $\alpha_t$ is reduced gradually over the iterations.
  • Attractiveness ($\beta_0$): Often set to 1 for general applications.
  • Light Absorption ($\gamma$): Should be tied to the characteristic length scale $L$ of the problem, usually $\gamma = 1/\sqrt{L}$.
  • Population Size ($n$): Efficient performance is generally obtained with $n = 25$ to $40$.

The computational complexity of the FA is $O(n^2 t)$ in the worst case, and it can potentially be reduced to $O(n t \log n)$ by using ranking and sorting techniques for the attractiveness comparisons.
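To make the parameter choices and the $O(n^2 t)$ cost concrete, here is a minimal sketch of the main FA loop. It is a simplified reading of the standard algorithm, not the authors' implementation; the geometric decay of $\alpha_t$, the default $\alpha_0 = 0.1L$, and the box-bounded search domain are assumptions for illustration.

```python
import numpy as np

def firefly_optimize(f, dim, bounds, n=25, iters=100,
                     beta0=1.0, gamma=None, alpha0=None, decay=0.97, seed=0):
    """Minimize f over the box [lo, hi]^dim with a basic firefly loop.

    The double loop over fireflies gives the O(n^2) cost per iteration,
    hence O(n^2 t) overall for t iterations.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    L = hi - lo                                    # characteristic length scale
    gamma = 1.0 / np.sqrt(L) if gamma is None else gamma
    alpha = 0.1 * L if alpha0 is None else alpha0  # scale-aware randomization
    X = rng.uniform(lo, hi, size=(n, dim))         # initial population
    light = np.array([f(x) for x in X])            # brightness ~ objective value

    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:            # firefly j is brighter (lower f)
                    r_sq = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r_sq)
                    X[i] = np.clip(X[i] + beta * (X[j] - X[i])
                                   + alpha * rng.standard_normal(dim), lo, hi)
                    light[i] = f(X[i])
        alpha *= decay                             # shift from exploration to exploitation
    best = int(np.argmin(light))
    return X[best], light[best]

# Example: minimize the sphere function in 5 dimensions.
x_best, f_best = firefly_optimize(lambda x: float(np.sum(x ** 2)), dim=5, bounds=(-5.0, 5.0))
```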

Applications of Firefly Algorithm

FA has been applied successfully across a wide range of domains. Specifically:

  • Digital Image Processing: FA has been effectively utilized for digital image compression, showcasing superior performance in terms of computational time.
  • Engineering Design: Studies have validated FA's capability in solving nonlinear, multimodal design optimization problems.
  • Scheduling: Discrete versions of FA have shown efficiency in tackling NP-hard scheduling problems.
  • Clustering and Classification: FA has been extensively tested and proven effective in clustering tasks, often outperforming other algorithms.
  • Neural Network Training: FA has been explored as a means to optimize the training of neural networks.

Efficiency Analysis

FA's efficiency can be attributed to two main advantages:

  1. Automatic Subdivision: FA inherently subdivides the population based on brightness and distance, enabling diversification within search spaces, which is particularly beneficial for multimodal optimization problems.
  2. Parameter Control: The FA allows dynamic tuning of the parameters that govern the balance between exploration and exploitation, thereby accelerating convergence; a minimal sketch of one such schedule follows this list.
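One widely used schedule, shown here as an illustrative choice rather than a prescription from the paper, reduces the randomization parameter geometrically so that early iterations explore broadly while later iterations exploit the best regions found:

```python
def alpha_schedule(alpha0, decay, t):
    """Geometric cooling of the randomization parameter: alpha_t = alpha0 * decay**t.

    A large alpha early on encourages exploration; as alpha shrinks,
    moves become dominated by attraction and the search turns to exploitation.
    """
    return alpha0 * decay ** t
```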

Numerical Results and Algorithm Comparisons

Through various benchmark functions and optimization tasks, the FA has consistently required fewer function evaluations compared to Genetic Algorithms (GA) and Particle Swarm Optimization (PSO), thereby proving its computational efficiency.

Implications and Future Directions

The implications of this research underscore FA's versatility and robustness in solving complex optimization problems across various fields. Future research should focus on:

  • Higher-dimensional optimization: Refining FA for large-scale problems.
  • Hybrid Approaches: Incorporating elements from other metaheuristics to enhance FA's performance.
  • Theoretical Foundations: Further bridging the gap between empirical performance and theoretical understanding of FA's convergence behaviors.

In conclusion, the FA stands out as a powerful tool for optimization, with continued research promising to expand its applicability and efficiency in solving increasingly complex problems.