Constant optimization in symbolic regression is an important task that has been addressed by several researchers. It has been demonstrated that continuous optimization techniques are effective at finding good values for the constants by minimizing the prediction error. In this paper, we evaluate several continuous optimization methods that can be used to perform constant optimization in symbolic regression.
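To make the setting concrete, here is a minimal sketch of constant optimization for a fixed symbolic expression: the structure is held fixed and only the numeric constants are tuned by a continuous optimizer minimizing prediction error. The expression, the synthetic data, and the use of scipy.optimize.minimize are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np
from scipy.optimize import minimize

# Fixed symbolic structure with tunable constants c0, c1, c2:
# f(x) = c0 * sin(c1 * x) + c2
def model(c, x):
    return c[0] * np.sin(c[1] * x) + c[2]

# Synthetic training data (illustrative).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = 1.5 * np.sin(0.8 * x) + 2.0 + rng.normal(0, 0.1, x.size)

# Prediction error to minimize: mean squared error over the data.
def mse(c):
    return np.mean((model(c, x) - y) ** 2)

# Continuous optimization of the constants from a rough initial guess.
result = minimize(mse, x0=np.ones(3), method="BFGS")
print("fitted constants:", result.x)   # ideally close to [1.5, 0.8, 2.0]
print("final MSE:", result.fun)
```

Any gradient-free method (e.g. Nelder-Mead) could be swapped in for BFGS here; comparing such choices is precisely the kind of evaluation the abstract describes.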
The compressive strength of high-performance concrete (HPC) can be predicted by a nonlinear function of the proportions of its components. However, HPC is a complex material, and finding that nonlinear function is not trivial. Many distinct techniques, such as traditional statistical regression methods and machine learning methods, have been applied to this task, reaching varying levels of accuracy.
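One way to frame the task is as nonlinear regression from mixture proportions to strength. The sketch below fits a gradient-boosted model with scikit-learn on synthetic stand-in data; the actual dataset and model are not given in the abstract, and the feature list (cement, slag, fly ash, water, superplasticizer, aggregates, curing age) is an assumption based on common HPC formulations.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Stand-in for an HPC dataset: rows are mixtures, columns are component
# proportions plus curing age (synthetic data, not real measurements).
rng = np.random.default_rng(42)
n = 500
X = rng.uniform(0, 1, size=(n, 8))  # cement, slag, fly ash, water, SP, coarse agg, fine agg, age
# A made-up nonlinear ground truth, standing in for the unknown strength function.
y = 40 * X[:, 0] - 25 * X[:, 3] + 10 * np.log1p(X[:, 7]) + rng.normal(0, 2, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A nonlinear learner approximates the strength function from mixture data.
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("MAE on held-out mixtures:", mean_absolute_error(y_test, model.predict(X_test)))
```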
Linear Genetic Programming (LGP) is an Evolutionary Computation algorithm inspired by the Genetic Programming (GP) algorithm. Instead of using the standard tree representation of GP, LGP evolves a linear program, which induces a graph-based data flow and enables code reuse. LGP has been shown to outperform GP on several problems, including Symbolic Regression (SReg).
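To make the representation concrete, below is a minimal sketch of how a linear program over registers can be interpreted. The instruction format and register conventions are illustrative assumptions, not the exact encoding used in the paper; the point is that a register written once can feed several later instructions, giving the graph-shaped data flow and code reuse that a tree would have to duplicate.

```python
import math

# A linear program: a sequence of register-based instructions
# (dest, op, src1, src2). Inputs are loaded into the first registers
# before execution; the output is read from r0 afterwards.
PROGRAM = [
    ("r2", "mul", "r0", "r0"),   # r2 = x * x
    ("r3", "add", "r2", "r1"),   # r3 = x^2 + y
    ("r2", "sin", "r3", None),   # r2 = sin(x^2 + y)
    ("r0", "add", "r2", "r3"),   # r0 = sin(x^2 + y) + (x^2 + y)
]                                # r3 feeds two instructions: code reuse

OPS = {
    "add": lambda a, b: a + b,
    "mul": lambda a, b: a * b,
    "sin": lambda a, _: math.sin(a),
}

def run(program, inputs, n_registers=4):
    regs = {f"r{i}": 0.0 for i in range(n_registers)}
    for i, v in enumerate(inputs):
        regs[f"r{i}"] = v
    for dest, op, s1, s2 in program:
        regs[dest] = OPS[op](regs[s1], regs[s2] if s2 else 0.0)
    return regs["r0"]

print(run(PROGRAM, inputs=[1.0, 2.0]))  # evaluates the evolved expression at (x, y)
```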
In this paper we investigate how to efficiently apply Approximate Karush–Kuhn–Tucker (AKKT) proximity measures as stopping criteria for optimization algorithms that do not generate approximations of the Lagrange multipliers. We prove that the KKT error measure tends to zero when approaching a solution, and we develop a simple model to compute the KKT error measure that requires only the gradients of the objective and constraint functions.
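A common way to obtain such a measure when the algorithm supplies no multipliers is to estimate them on the fly by a small least-squares fit of the stationarity condition over the nearly active constraints. The sketch below does this for inequality constraints g_i(x) <= 0 using only objective and constraint gradients; the specific formulation is an illustrative assumption, not necessarily the model developed in the paper.

```python
import numpy as np
from scipy.optimize import nnls

def kkt_error(grad_f, grads_g, g_vals, tol_active=1e-6):
    """Approximate KKT error at a point x for min f(x) s.t. g_i(x) <= 0.

    grad_f : gradient of the objective at x, shape (n,)
    grads_g: gradients of the constraints at x, shape (m, n)
    g_vals : constraint values g_i(x), shape (m,)
    """
    active = g_vals > -tol_active             # nearly active constraints
    A = grads_g[active]                        # (k, n)
    if A.size == 0:
        return float(np.linalg.norm(grad_f))   # unconstrained stationarity
    # Estimate multipliers by nonnegative least squares:
    # minimize ||grad_f + A^T lam|| subject to lam >= 0.
    lam, _ = nnls(A.T, -grad_f)
    stationarity = np.linalg.norm(grad_f + A.T @ lam)
    feasibility = np.linalg.norm(np.maximum(g_vals, 0.0))
    return float(max(stationarity, feasibility))

# Example: min x0^2 + x1^2  s.t.  1 - x0 <= 0; the solution is x* = (1, 0).
x = np.array([1.0, 1e-4])
print(kkt_error(grad_f=2 * x,
                grads_g=np.array([[-1.0, 0.0]]),
                g_vals=np.array([1 - x[0]])))  # small, and -> 0 as x -> x*
```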
Constrained optimization problems play a crucial role in many application domains, ranging from engineering design to finance and logistics. Specific techniques are therefore needed to handle complex fitness landscapes characterized by multiple constraints. In recent decades, a number of novel meta-heuristics have been applied to constrained optimization. Among these, the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is one of the most prominent.
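As one concrete way to couple an evolution strategy with constraints, the sketch below runs a small (mu, lambda)-ES with a static penalty on constraint violation. It is a generic illustration under assumed problem data: CMA-ES additionally adapts a full covariance matrix for sampling, which this simplified isotropic version omits.

```python
import numpy as np

# Illustrative problem: minimize the sphere function subject to x0 + x1 >= 1.
def f(x):
    return float(np.sum(x ** 2))

def g(x):
    return 1.0 - x[0] - x[1]          # feasible iff g(x) <= 0

def penalized(x, rho=100.0):
    # Static penalty: infeasible points pay proportionally to their violation.
    return f(x) + rho * max(g(x), 0.0) ** 2

# A minimal (mu, lambda) evolution strategy with an isotropic step size.
rng = np.random.default_rng(1)
mean, sigma, mu, lam = np.zeros(2), 1.0, 5, 20
for gen in range(200):
    pop = mean + sigma * rng.standard_normal((lam, 2))
    pop = pop[np.argsort([penalized(x) for x in pop])]
    mean = pop[:mu].mean(axis=0)      # recombine the mu best samples
    sigma *= 0.98                     # simple deterministic step-size decay

print("best point:", mean)            # close to (0.5, 0.5), the constrained optimum
print("f:", f(mean), "violation:", max(g(mean), 0.0))
```

Penalty functions are only one of several constraint-handling strategies used with such meta-heuristics; repair operators and feasibility-first ranking are common alternatives.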