Title
Nonlinear Least Squares Optimization of Constants in Symbolic Regression
Abstract
In this publication a constant optimization approach for symbolic regression by genetic programming is presented. The Levenberg-Marquardt algorithm, a nonlinear least-squares method, tunes the numerical values of constants in symbolic expression trees to improve their fit to observed data. The gradient information required by the algorithm is obtained by automatic differentiation, which efficiently calculates the partial derivatives of symbolic expression trees. The performance of the methodology is tested for standard and offspring selection genetic programming on four well-known benchmark datasets. Although constant optimization adds runtime overhead, the achievable quality increases significantly compared to the standard algorithms. For example, the average coefficient of determination on the Poly-10 problem rises from 0.537 without constant optimization to over 0.8 with constant optimization enabled. In addition to the experimental results, the effect of different parameter settings, such as the number of individuals to be optimized, is detailed.
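The core idea described in the abstract can be illustrated with a small sketch: given a fixed symbolic expression with tunable constants, a damped Gauss-Newton (Levenberg-Marquardt) loop uses the partial derivatives of the expression with respect to its constants to minimize the squared residuals. The model form, variable names, and the hand-coded Jacobian below are illustrative assumptions, not the paper's implementation; the paper obtains the derivatives via automatic differentiation of arbitrary expression trees.

```python
import numpy as np

# Hypothetical expression with two constants: f(x; c) = c0 * x0 * x1 + c1 * x2
def model(c, X):
    return c[0] * X[:, 0] * X[:, 1] + c[1] * X[:, 2]

def jacobian(c, X):
    # Partial derivatives of the model w.r.t. c0 and c1. Written by hand here;
    # the paper derives these automatically from the expression tree.
    return np.column_stack([X[:, 0] * X[:, 1], X[:, 2]])

def levenberg_marquardt(c, X, y, iters=50, lam=1e-3):
    """Minimize sum((y - model(c, X))**2) over the constants c."""
    for _ in range(iters):
        r = y - model(c, X)                      # residuals
        J = jacobian(c, X)
        A = J.T @ J + lam * np.eye(len(c))       # damped normal equations
        step = np.linalg.solve(A, J.T @ r)
        c_new = c + step
        if np.sum((y - model(c_new, X)) ** 2) < np.sum(r ** 2):
            c = c_new
            lam *= 0.5                           # accept step: move toward Gauss-Newton
        else:
            lam *= 2.0                           # reject step: move toward gradient descent
    return c

# Synthetic data generated from known constants, then recovered by the optimizer.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_c = np.array([2.5, -1.3])
y = model(true_c, X)
c_fit = levenberg_marquardt(np.array([1.0, 1.0]), X, y)
```

In a genetic programming run, such a local optimization would be applied to selected individuals between evaluation and selection, trading extra runtime per individual for a better fit of each expression's constants.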
Year
2013
DOI
10.1007/978-3-642-53856-8_53
Venue
EUROCAST (1)
Keywords
genetic programming, automatic differentiation
Field
Standard algorithms, Automatic differentiation, Genetic programming, Partial derivative, Theoretical computer science, Artificial intelligence, Symbolic regression, Automatic programming, Mathematical optimization, Meta-optimization, Algorithm, Coefficient of determination, Machine learning, Mathematics
DocType
Conference
Volume
8111
ISSN
0302-9743
Citations
1
PageRank
0.49
References
6
Authors
4
Name	Order	Citations	PageRank
Michael Kommenda	1	97	15.58
Michael Affenzeller	2	339	62.47
Gabriel Kronberger	3	192	25.40
Stephan M. Winkler	4	140	22.90