
Local Optimization Often is Ill-conditioned in Genetic Programming for Symbolic Regression (2209.00942v1)

Published 2 Sep 2022 in cs.NE

Abstract: Gradient-based local optimization has been shown to improve the results of genetic programming (GP) for symbolic regression. Several state-of-the-art GP implementations use iterative nonlinear least squares (NLS) algorithms such as the Levenberg-Marquardt algorithm for local optimization. The effectiveness of NLS algorithms depends on appropriate scaling and conditioning of the optimization problem. This has so far been ignored in the symbolic regression and GP literature. In this study we use a singular value decomposition of NLS Jacobian matrices to determine the numeric rank and the condition number. We perform experiments with a GP implementation and six different benchmark datasets. Our results show that rank-deficient and ill-conditioned Jacobian matrices occur frequently and for all datasets. The issue is less extreme when restricting GP tree size and when using many nonlinear functions in the function set.
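The diagnostic the abstract describes can be sketched in a few lines of numpy. The model and tolerance below are illustrative assumptions, not the authors' code: an over-parameterized expression f(x; a, b, c) = a·exp(b + c·x), of the kind GP can easily evolve, in which a and b are redundant (a·exp(b) is a single effective coefficient), so its NLS Jacobian is rank-deficient and the condition number blows up.

```python
import numpy as np

# Hedged sketch (not the paper's implementation): numeric rank and
# condition number of an NLS Jacobian via singular value decomposition.
x = np.linspace(0.0, 1.0, 50)
a, b, c = 2.0, 0.5, -1.0

# Jacobian columns = partial derivatives of f(x; a, b, c) = a*exp(b + c*x).
J = np.column_stack([
    np.exp(b + c * x),          # df/da
    a * np.exp(b + c * x),      # df/db  (exactly a * df/da -> redundancy)
    a * x * np.exp(b + c * x),  # df/dc
])

s = np.linalg.svd(J, compute_uv=False)           # singular values, descending
tol = max(J.shape) * np.finfo(float).eps * s[0]  # common numeric-rank tolerance
numeric_rank = int(np.sum(s > tol))
cond = s[0] / s[-1]                              # 2-norm condition number

print(numeric_rank)  # 2 < 3 parameters, i.e. rank-deficient
print(cond)          # very large: the NLS problem is ill-conditioned
```

For a rank-deficient Jacobian the smallest singular value is near machine precision, so Levenberg-Marquardt steps along the deficient direction are dominated by damping and rounding, which is the conditioning problem the paper quantifies across benchmark datasets.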
