Hybrid Symbolic Regression with the Bison Seeker Algorithm
Abstract
This paper focuses on the use of the Bison Seeker Algorithm (BSA) in a hybrid genetic programming approach to symbolic regression, a supervised machine learning method. While the basic version of symbolic regression optimizes both the model structure and its parameters, the hybrid version uses genetic programming to find the model structure and local learning to tune the model parameters. This tuning of parameters represents the lifetime adaptation of individuals. The paper compares the basic version of symbolic regression with the hybrid version, in which individuals adapt during their lifetime via the Bison Seeker Algorithm. The author also investigates the influence of the Bison Seeker Algorithm on the rate of evolution in the search for a function that fits the given input-output data. The results of the current study support the conclusion that the local algorithm accelerates evolution, even with few iterations of the Bison Seeker Algorithm and small populations.
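To make the division of labor concrete, the following is a minimal, hypothetical Python sketch of the lifetime-adaptation step: the model structure is assumed fixed (as if already proposed by genetic programming), and a small Bison-Seeker-style swarm, split into an exploiting "swarming" group and an exploring "running" group, tunes the numeric constants. All function names, group sizes, and update rules here are illustrative assumptions, not the paper's actual implementation.

```python
import math
import random

# Sketch: GP is assumed to have proposed the structure a*sin(x) + b*x;
# the local search below only tunes the numeric parameters (a, b).
def model(params, x):
    a, b = params
    return a * math.sin(x) + b * x

def mse(params, data):
    return sum((model(params, x) - y) ** 2 for x, y in data) / len(data)

def bison_tune(data, dim=2, swarm=8, iters=10, step=0.3):
    """Tune parameters with a tiny swarm: half the herd moves toward the
    current best solution (exploitation), the rest take random exploratory
    steps (the 'runners'). Moves are kept only if they reduce the error."""
    herd = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(swarm)]
    best = min(herd, key=lambda p: mse(p, data))
    for _ in range(iters):
        for i, p in enumerate(herd):
            if i < swarm // 2:  # swarming group: step toward the best
                cand = [pi + step * random.random() * (bi - pi)
                        for pi, bi in zip(p, best)]
            else:               # running group: random exploratory step
                cand = [pi + step * random.uniform(-1, 1) for pi in p]
            if mse(cand, data) < mse(p, data):
                herd[i] = cand
        best = min(herd, key=lambda p: mse(p, data))
    return best, mse(best, data)

if __name__ == "__main__":
    random.seed(1)
    # Synthetic input-output data from the target 1.5*sin(x) + 0.5*x.
    data = [(x / 10, 1.5 * math.sin(x / 10) + 0.5 * (x / 10))
            for x in range(50)]
    params, err = bison_tune(data)
    print("tuned params:", params, "mse:", err)
```

In the hybrid scheme described above, the error after such tuning (rather than the raw error of the inherited parameters) would serve as the individual's fitness, which is what makes the tuning a lifetime adaptation.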