Analysing the Robustness of Evolutionary Algorithms to Noise: Refined Runtime Bounds and an Example Where Noise is Beneficial (1812.00966v1)
Abstract: We analyse the performance of the well-known evolutionary algorithms (1+1)EA and (1+$\lambda$)EA in the prior noise model, where in each fitness evaluation the search point is altered before evaluation with probability $p$. We present refined results for the expected optimisation time of the (1+1)EA and the (1+$\lambda$)EA on the function LeadingOnes, where bits have to be optimised in sequence. Previous work showed that the (1+1)EA on LeadingOnes runs in polynomial expected time if $p = O((\log n)/n^2)$ and needs superpolynomial expected time if $p = \omega((\log n)/n)$, leaving a huge gap for which no results were known. We close this gap by showing that the expected optimisation time is $\Theta(n^2) \cdot \exp(\Theta(\min\{pn^2, n\}))$ for all $p \le 1/2$, allowing us for the first time to locate the threshold between polynomial and superpolynomial expected times at $p = \Theta((\log n)/n^2)$. Hence the (1+1)EA on LeadingOnes is much more sensitive to noise than previously thought. We also show that offspring populations of size $\lambda \ge 3.42\log n$ can effectively deal with much higher noise than known before. Finally, we present an example of a rugged landscape where prior noise can help to escape from local optima by blurring the landscape and allowing a hill climber to see the underlying gradient. We prove that in this particular setting noise can have a highly beneficial effect on performance.
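To make the setting concrete, the following is a minimal sketch (not the paper's experimental code) of the (1+1) EA on LeadingOnes under prior noise: with probability $p$ the search point is perturbed before each evaluation, here by flipping one uniformly random bit, a common instantiation of the prior noise model. Function names and the re-evaluation scheme (both parent and offspring are evaluated anew each generation) are illustrative assumptions.

```python
import random

def leading_ones(x):
    """LeadingOnes: number of consecutive 1-bits from the left."""
    count = 0
    for bit in x:
        if bit != 1:
            break
        count += 1
    return count

def noisy_eval(x, p):
    """Prior noise model (one common variant): with probability p,
    flip a uniformly random bit of a copy of x before evaluating.
    The stored search point x itself is never altered."""
    if random.random() < p:
        y = list(x)
        i = random.randrange(len(y))
        y[i] ^= 1
        return leading_ones(y)
    return leading_ones(x)

def one_plus_one_ea(n, p, max_evals=200000):
    """(1+1) EA: standard bit mutation with rate 1/n; the offspring
    replaces the parent if its noisy fitness is at least the parent's
    noisy fitness. Returns the number of evaluations used, or
    max_evals if the true optimum was not reached in time."""
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 0
    while evals < max_evals:
        if leading_ones(x) == n:  # true (noise-free) optimum reached
            return evals
        y = [b ^ 1 if random.random() < 1.0 / n else b for b in x]
        if noisy_eval(y, p) >= noisy_eval(x, p):
            x = y
        evals += 2  # two noisy evaluations per generation
    return evals
```

For $p = 0$ this reduces to the classic (1+1) EA, whose expected time on LeadingOnes is $\Theta(n^2)$; raising $p$ toward $\Theta((\log n)/n^2)$ and beyond lets one observe empirically the sharp slowdown the theorem predicts.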