
Sharp Bounds for Genetic Drift in Estimation of Distribution Algorithms (1910.14389v2)

Published 31 Oct 2019 in cs.NE

Abstract: Estimation of Distribution Algorithms (EDAs) are one branch of Evolutionary Algorithms (EAs) in the broad sense that they evolve a probabilistic model instead of a population. Many existing algorithms fall into this category. Analogous to genetic drift in EAs, EDAs also encounter the phenomenon that updates of the probabilistic model not justified by the fitness move the sampling frequencies to the boundary values. This can result in a considerable performance loss. This paper proves the first sharp estimates of the boundary hitting time of the sampling frequency of a neutral bit for several univariate EDAs. For the UMDA that selects $\mu$ best individuals from $\lambda$ offspring each generation, we prove that the expected first iteration when the frequency of the neutral bit leaves the middle range $[\tfrac 14, \tfrac 34]$ and the expected first time it is absorbed in 0 or 1 are both $\Theta(\mu)$. The corresponding hitting times are $\Theta(K^2)$ for the cGA with hypothetical population size $K$. This paper further proves that for PBIL with parameters $\mu$, $\lambda$, and $\rho$, in an expected number of $\Theta(\mu/\rho^2)$ iterations the sampling frequency of a neutral bit leaves the interval $[\Theta(\rho/\mu),1-\Theta(\rho/\mu)]$ and then always the same value is sampled for this bit, that is, the frequency approaches the corresponding boundary value with maximum speed. For the lower bounds implicit in these statements, we also show exponential tail bounds. If a bit is neutral or has a preference for ones, then the lower bounds on the times to reach a low frequency value still hold. An analogous statement holds for bits that are neutral or prefer the value zero.
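The cGA result is easy to see experimentally. The sketch below (our own illustration, not code from the paper) simulates the sampling frequency of a single neutral bit under the cGA with hypothetical population size $K$: two offspring are sampled from the frequency, and since the bit is neutral the "winner" is uniformly random, so whenever the two samples differ the frequency performs an unbiased $\pm 1/K$ step. The function name and the `max_iters` safety cap are our choices.

```python
import random

def cga_neutral_drift(K, seed=0, max_iters=10**6):
    """Iterations until a neutral bit's cGA frequency hits the boundary 0 or 1.

    Sketch of the cGA update for one neutral bit with hypothetical
    population size K: the frequency p does an unbiased +-1/K random
    walk whenever the two sampled offspring disagree on this bit.
    """
    rng = random.Random(seed)
    p = 0.5  # sampling frequency of the neutral bit, initialized at 1/2
    for t in range(1, max_iters + 1):
        x = rng.random() < p  # bit value in first offspring
        y = rng.random() < p  # bit value in second offspring
        if x != y:
            # The bit is neutral, so selection between the two offspring
            # is a fair coin flip; the frequency moves +-1/K accordingly.
            p += 1.0 / K if rng.random() < 0.5 else -1.0 / K
        if p <= 0.0 or p >= 1.0:
            return t  # boundary hitting time
    return max_iters  # safety cap (not reached for moderate K)
```

Averaging `cga_neutral_drift(K, seed=s)` over many seeds shows the hitting time growing roughly quadratically in $K$, matching the $\Theta(K^2)$ bound in the abstract.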

