
AUTHOR

David Wittenberg

showing 3 related works from this author

Improving estimation of distribution genetic programming with novelty initialization

2021

Estimation of distribution genetic programming (EDA-GP) replaces the standard variation operations of genetic programming (GP) by learning and sampling from a probabilistic model. Unfortunately, many EDA-GP approaches suffer from a rapidly decreasing population diversity which often leads to premature convergence. However, novelty search, an approach that searches for novel solutions to cover sparse areas of the search space, can be used for generating diverse initial populations. In this work, we propose novelty initialization and test this new method on a generalization of the royal tree problem and compare its performance to ramped half-and-half (RHH) using a recent EDA-GP approach. We f…
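A minimal sketch of the general pattern behind novelty-based initialization follows (illustrative only: the random tree generator, the structural descriptor, and the Euclidean distance are placeholders, and the paper's actual distance measure and selection scheme may differ):

```python
# Sketch of novelty-based population initialization (illustrative, not the
# authors' implementation): each slot is filled by the most novel candidate,
# i.e. the one farthest on average from its k nearest neighbours so far.
import random
import math

def random_tree(max_depth=6):
    """Placeholder: grow a random expression tree as a nested tuple."""
    if max_depth == 0 or random.random() < 0.3:
        return random.choice(["x", "y", "1"])            # terminals
    op = random.choice(["+", "-", "*"])                   # functions
    return (op, random_tree(max_depth - 1), random_tree(max_depth - 1))

def descriptor(tree):
    """Crude structural descriptor: (number of nodes, depth)."""
    if not isinstance(tree, tuple):
        return (1, 1)
    sizes, depths = zip(*(descriptor(child) for child in tree[1:]))
    return (1 + sum(sizes), 1 + max(depths))

def novelty(candidate, population, k=5):
    """Mean distance to the k nearest neighbours in the current population."""
    if not population:
        return float("inf")
    d = sorted(math.dist(descriptor(candidate), descriptor(p)) for p in population)
    return sum(d[:k]) / min(k, len(d))

def novelty_initialization(pop_size=100, candidates_per_slot=10):
    """Fill the initial population one slot at a time with the most novel candidate."""
    population = []
    while len(population) < pop_size:
        best = max((random_tree() for _ in range(candidates_per_slot)),
                   key=lambda t: novelty(t, population))
        population.append(best)
    return population
```

The design intent captured here is that the initial sample is pushed toward sparse regions of the search space, which is what makes the resulting population more diverse than one produced by ramped half-and-half.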

Subjects: Computer science; Generalization; Novelty; Initialization; Statistical model; Genetic programming; Variation (game tree); Machine learning; Tree (data structure); Artificial intelligence; Premature convergence
Published in: Proceedings of the Genetic and Evolutionary Computation Conference Companion

DAE-GP

2020

Estimation of distribution genetic programming (EDA-GP) algorithms are metaheuristics where sampling new solutions from a learned probabilistic model replaces the standard mutation and recombination operators of genetic programming (GP). This paper presents DAE-GP, a new EDA-GP that uses denoising autoencoder long short-term memory networks (DAE-LSTMs) as its probabilistic model. DAE-LSTMs are artificial neural networks that first learn the properties of a parent population by mapping promising candidate solutions to a latent space and reconstructing the candidate solutions from the latent space. The trained model is then used to sample new offspring solutions. We show on a generalization of t…
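As a rough sketch of the denoising-autoencoder idea (not the authors' code: DAE-GP uses encoder-decoder LSTMs on linearized trees, while this toy version collapses them into a single LSTM denoiser over an illustrative vocabulary):

```python
# Simplified sketch of the denoising-autoencoder idea: corrupt linearized
# parent trees, train an LSTM to reconstruct the clean parents, then sample
# offspring by passing corrupted parents through the trained model.
import random
import torch
import torch.nn as nn

VOCAB = ["<pad>", "+", "-", "*", "x", "y", "1"]      # illustrative token set
TOK = {t: i for i, t in enumerate(VOCAB)}

class DaeLstm(nn.Module):
    def __init__(self, vocab_size=len(VOCAB), emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))
        return self.out(h)                            # per-position logits

def corrupt(seq, rate=0.2):
    """Denoising corruption: randomly overwrite tokens."""
    return [random.randrange(1, len(VOCAB)) if random.random() < rate else t
            for t in seq]

# Parent population as linearized trees (prefix notation); equal lengths here
# only to keep the toy batch rectangular.
parents = [["+", "x", "y"], ["*", "x", "1"], ["-", "y", "x"]]
clean = torch.tensor([[TOK[t] for t in p] for p in parents])
noisy = torch.tensor([corrupt([TOK[t] for t in p]) for p in parents])

model = DaeLstm()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step: reconstruct the clean parents from their corrupted copies.
opt.zero_grad()
logits = model(noisy)
loss = nn.functional.cross_entropy(logits.reshape(-1, len(VOCAB)), clean.reshape(-1))
loss.backward()
opt.step()

# Offspring sampling: decode from the model's per-position distribution
# (argmax here; sampling from the softmax also works).
with torch.no_grad():
    offspring = model(noisy).argmax(dim=-1)
```

The corruption step is what distinguishes a denoising autoencoder from a plain autoencoder: by reconstructing clean parents from perturbed copies, the model learns a distribution around the parent population rather than memorizing it, which is what makes sampled offspring vary from their parents.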

Subjects: Artificial neural network; Computer science; Offspring; Population; Probabilistic logic; Genetic programming; Statistical model; Machine learning; Tree (data structure); Estimation of distribution algorithm; Artificial intelligence; Metaheuristic
Published in: Proceedings of the 2020 Genetic and Evolutionary Computation Conference

On sampling error in evolutionary algorithms

2021

The initial population in evolutionary algorithms (EAs) should form a representative sample of all possible solutions (the search space). While large populations accurately approximate the distribution of possible solutions, small populations tend to incorporate a sampling error. A low sampling error at initialization is necessary (but not sufficient) for a reliable search since a low sampling error reduces the overall random variations in a random sample. For this reason, we have recently presented a model to determine a minimum initial population size so that the sampling error is lower than a threshold, given a confidence level. Our model allows practitioners of, for example, genetic pro…
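The paper's own model is not reproduced in the abstract. As a generic illustration of the underlying idea, namely bounding how far an observed frequency in a random initial sample can drift from its true probability at a given confidence level, a standard normal-approximation sample-size bound looks like this (my example, not the authors' model):

```python
# Illustrative only: a textbook confidence-interval-based sample-size bound,
# not the model proposed in the paper. It gives the smallest n such that an
# observed frequency deviates from its true probability p by at most eps,
# at the requested confidence level (normal approximation).
from math import ceil
from statistics import NormalDist

def min_population_size(p, eps, confidence=0.95):
    """Normal-approximation bound: n >= z^2 * p * (1 - p) / eps^2."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)   # two-sided quantile
    return ceil(z**2 * p * (1 - p) / eps**2)

# Example: a feature occurring with probability 0.1, tolerated deviation 0.02,
# 95% confidence.
print(min_population_size(p=0.1, eps=0.02, confidence=0.95))  # -> 865
```

In the EA setting, p could for instance be the probability that a particular function or terminal appears in a randomly initialized individual; n is then the smallest initial population size for which the observed frequency of that feature stays within eps of p at the chosen confidence level.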

Subjects: Distribution (mathematics); Population size; Population; Statistics; Evolutionary algorithm; Initialization; Small population size; Genetic programming; Confidence interval; Mathematics
Published in: Proceedings of the Genetic and Evolutionary Computation Conference Companion