RESEARCH PRODUCT
Denoising Autoencoders for Fast Combinatorial Black Box Optimization
Malte Probst

subject: FOS: Computer and information sciences; Computer Science - Neural and Evolutionary Computing (cs.NE); Machine learning; Artificial intelligence; Artificial neural network; Autoencoder; Noise reduction; Fitness approximation; Estimation of distribution algorithm; Black box; Orders of magnitude (bit rate); I.2.6; I.2.8

description:
Estimation of Distribution Algorithms (EDAs) require flexible probability models that can be efficiently learned and sampled. Autoencoders (AEs) are generative stochastic networks with these desired properties. We integrate a special type of AE, the Denoising Autoencoder (DAE), into an EDA and evaluate the performance of DAE-EDA on several single-objective combinatorial optimization problems. We assess the number of fitness evaluations as well as the required CPU times. We compare the results to those of the Bayesian Optimization Algorithm (BOA) and of RBM-EDA, another EDA based on a generative neural network, which has proven competitive with BOA. For the considered problem instances, DAE-EDA is considerably faster than both BOA and RBM-EDA, sometimes by orders of magnitude. Its number of fitness evaluations is higher than that of BOA, but competitive with RBM-EDA. These results show that DAEs can be useful tools for problems with low but non-negligible fitness evaluation costs.
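The DAE-EDA loop described in the abstract can be sketched as follows. This is a minimal illustrative implementation on the OneMax toy problem, not the authors' code: the network architecture, population size, corruption rate, and all other hyperparameters are assumptions chosen for a small, self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

def onemax(X):
    """Toy fitness: number of ones in each bit string (row)."""
    return X.sum(axis=1)

def train_dae(X, n_hidden=16, epochs=200, lr=0.1, corruption=0.1):
    """Fit a one-hidden-layer denoising autoencoder to binary strings X.

    Inputs are corrupted by random bit flips; the network is trained to
    reconstruct the clean input (sigmoid cross-entropy, plain backprop).
    """
    n, d = X.shape
    W1 = rng.normal(0, 0.1, (d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.1, (n_hidden, d)); b2 = np.zeros(d)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        noise = rng.random(X.shape) < corruption      # flip bits at random
        Xc = np.where(noise, 1 - X, X)
        H = sig(Xc @ W1 + b1)
        Y = sig(H @ W2 + b2)
        dY = (Y - X) / n                              # cross-entropy gradient
        dW2 = H.T @ dY; db2 = dY.sum(0)
        dH = (dY @ W2.T) * H * (1 - H)
        dW1 = Xc.T @ dH; db1 = dH.sum(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def sample_dae(params, X_seed, steps=1, corruption=0.1):
    """Sample candidates by corrupting seeds and reconstructing them."""
    W1, b1, W2, b2 = params
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    X = X_seed.copy()
    for _ in range(steps):
        noise = rng.random(X.shape) < corruption
        Xc = np.where(noise, 1 - X, X)
        H = sig(Xc @ W1 + b1)
        P = sig(H @ W2 + b2)
        X = (rng.random(P.shape) < P).astype(float)   # Bernoulli sample
    return X

def dae_eda(d=20, pop=100, gens=30):
    """Minimal DAE-EDA: select, fit DAE to elites, sample next population."""
    X = (rng.random((pop, d)) < 0.5).astype(float)
    best = 0.0
    for _ in range(gens):
        f = onemax(X)
        best = max(best, f.max())
        elite = X[np.argsort(f)[-pop // 2:]]          # truncation selection
        params = train_dae(elite)
        seeds = elite[rng.integers(0, len(elite), pop)]
        X = sample_dae(params, seeds)
    return best

best = dae_eda()
```

The key design point the paper exploits is visible here: training the DAE is a few passes of cheap backpropagation, and sampling is a single forward pass per candidate, whereas BOA must learn a Bayesian network structure each generation, which is far more expensive in CPU time.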
year | journal | country | edition | language
---|---|---|---|---
2015-03-06 | Proceedings of the Companion Publication of the 2015 Annual Conference on Genetic and Evolutionary Computation | | |