
RESEARCH PRODUCT

DAE-GP

Franz Rothlauf, Dirk Schweim, David Wittenberg

subject

Genetic programming, Estimation of distribution algorithm, Artificial neural network, Probabilistic logic, Statistical model, Machine learning, Tree (data structure), Population, Offspring, Artificial intelligence, Metaheuristic, Computer science

description

Estimation of distribution genetic programming (EDA-GP) algorithms are metaheuristics in which sampling new solutions from a learned probabilistic model replaces the standard mutation and recombination operators of genetic programming (GP). This paper presents DAE-GP, a new EDA-GP that uses denoising autoencoder long short-term memory networks (DAE-LSTMs) as the probabilistic model. DAE-LSTMs are artificial neural networks that first learn the properties of a parent population by mapping promising candidate solutions to a latent space and reconstructing the candidate solutions from that latent space. The trained model is then used to sample new offspring solutions. We show on a generalization of the royal tree problem that DAE-GP outperforms standard GP and that the performance differences increase with problem complexity. Furthermore, DAE-GP creates offspring with higher fitness from a learned model than standard GP does. We believe that the key reason for the high performance of DAE-GP is that, unlike previous EDA-GP models, it imposes no assumptions about the relationships between learned variables. Instead, DAE-GP flexibly identifies and models the relevant dependencies of promising candidate solutions.
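The overall EDA-GP loop described above (select promising parents, fit a probabilistic model to them, sample offspring from the model) can be sketched in a few lines. This is a minimal toy illustration only: a position-wise token-frequency model stands in for the paper's DAE-LSTM, and the symbol set, sequence length, and fitness function are hypothetical placeholders, not the royal tree problem used in the paper.

```python
import random

random.seed(0)

TOKENS = ["x", "y", "+", "*"]   # hypothetical toy symbol set
LENGTH = 8                      # fixed sequence length for simplicity
POP, PARENTS, GENS = 100, 25, 30

def fitness(ind):
    # toy objective: count of "+" tokens stands in for program fitness
    return ind.count("+")

def train_model(parents):
    # stand-in for the DAE-LSTM: position-wise token frequencies
    model = []
    for pos in range(LENGTH):
        counts = {t: 1 for t in TOKENS}  # Laplace smoothing
        for ind in parents:
            counts[ind[pos]] += 1
        model.append(counts)
    return model

def sample(model):
    # sample a new offspring solution from the learned model
    ind = []
    for counts in model:
        toks, weights = zip(*counts.items())
        ind.append(random.choices(toks, weights=weights)[0])
    return ind

def eda_gp():
    pop = [[random.choice(TOKENS) for _ in range(LENGTH)]
           for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        model = train_model(pop[:PARENTS])          # learn from parents
        pop = pop[:PARENTS] + [sample(model)        # elitism + sampling
                               for _ in range(POP - PARENTS)]
    return max(pop, key=fitness)

best = eda_gp()
print(fitness(best))
```

In DAE-GP itself, `train_model` and `sample` would be replaced by training and iteratively querying the DAE-LSTM, which, unlike this position-wise model, can capture dependencies between positions in variable-length tree-shaped solutions.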

https://doi.org/10.1145/3377930.3390180