ANOVA-MOP: ANOVA Decomposition for Multiobjective Optimization
Real-world optimization problems often involve computationally expensive functions depending on a large number of input variables. Metamodel-based optimization methods can reduce the computational cost of evaluating expensive functions, but they do not reduce the dimension of the search domain, nor do they mitigate the effects of the curse of dimensionality. The dimension of the search domain can be reduced by functional ANOVA decomposition involving Sobol' sensitivity indices. This approach allows one to rank decision variables according to their impact on the objective function values. On the basis of the sparsity-of-effects principle, typically only a small number of decision variables significantl…
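To make the variable-ranking step concrete, the following is a minimal sketch (not the ANOVA-MOP implementation) of estimating first-order Sobol' sensitivity indices with a Saltelli-style Monte Carlo estimator and ranking the decision variables by them; the five-variable test function, the sample size, and all names are illustrative assumptions.

```python
# Sketch: rank decision variables by first-order Sobol' indices (assumed setup).
import numpy as np

def first_order_sobol(f, dim, n=4096, seed=None):
    """Estimate first-order Sobol' indices S_i of f on the unit hypercube [0, 1]^dim."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, dim))            # two independent sample matrices
    B = rng.random((n, dim))
    fA, fB = f(A), f(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]             # replace column i of A with column i of B
        S[i] = np.mean(fB * (f(ABi) - fA)) / var   # Saltelli-style estimator
    return S

# Hypothetical objective: only the first two of five variables are influential,
# mimicking the sparsity-of-effects situation exploited in the abstract above.
f = lambda X: np.sin(np.pi * X[:, 0]) + 0.7 * X[:, 1] ** 2 + 0.01 * X[:, 2]
S = first_order_sobol(f, dim=5, seed=0)
print(np.argsort(S)[::-1])              # variables ranked by estimated impact
```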
PAINT–SiCon: constructing consistent parametric representations of Pareto sets in nonconvex multiobjective optimization
We introduce a novel approximation method for multiobjective optimization problems called PAINT–SiCon. The method can construct consistent parametric representations of Pareto sets, especially for nonconvex problems, by interpolating between nondominated solutions of a given sampling in both the decision and the objective space. The proposed method is especially advantageous in computationally expensive cases, since the parametric representation of the Pareto set can be used as an inexpensive surrogate for the original problem during the decision-making process.
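As a rough illustration of the idea of a consistent parametric representation, the sketch below (not the actual PAINT–SiCon construction) builds a piecewise-linear curve through a set of nondominated points so that a single parameter value indexes both the decision and the objective space; the biobjective toy data and all names are assumptions.

```python
# Sketch: a piecewise-linear parametric surrogate through nondominated points.
import numpy as np

def parametric_surrogate(X, F):
    """X: (k, n) decision vectors, F: (k, m) objective vectors of nondominated points.
    The points are ordered along the front by the first objective inside the function."""
    order = np.argsort(F[:, 0])
    X, F = X[order], F[order]
    t_knots = np.linspace(0.0, 1.0, len(X))

    def curve(t):
        # The same parameter t interpolates the decision and the objective vectors.
        x = np.array([np.interp(t, t_knots, X[:, j]) for j in range(X.shape[1])]).T
        f = np.array([np.interp(t, t_knots, F[:, j]) for j in range(F.shape[1])]).T
        return x, f

    return curve

# Toy usage: three nondominated samples of a biobjective problem.
X = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
F = np.array([[0.0, 1.0], [0.3, 0.4], [1.0, 0.0]])
curve = parametric_surrogate(X, F)
x_mid, f_mid = curve(0.5)   # inexpensive surrogate query between sampled solutions
```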
Exact extension of the DIRECT algorithm to multiple objectives
The DIRECT algorithm is recognized as an efficient global optimization method that imposes few regularity requirements and has been proven to be globally convergent under general assumptions. DIRECT has inspired, or has been used as a component of, many multiobjective optimization algorithms. We propose an exact extension of the DIRECT method to multiple objectives that remains as faithful as possible to the original algorithm, and we provide a proof of global convergence (i.e., a guarantee that the set of points sampled by the algorithm becomes everywhere dense as the number of iterations grows). We test the efficiency of the algorithm on a nonlinear and nonconvex vector function.
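Any multiobjective extension of DIRECT needs a Pareto-dominance comparison between sampled points; the following is a minimal, illustrative sketch of such a dominance test and nondominated filter, not the paper's rule for selecting potentially optimal hyperrectangles.

```python
# Sketch: Pareto dominance and nondominated filtering (minimization).
import numpy as np

def dominates(fa, fb):
    """True if objective vector fa Pareto-dominates fb."""
    fa, fb = np.asarray(fa), np.asarray(fb)
    return bool(np.all(fa <= fb) and np.any(fa < fb))

def nondominated(points):
    """Return the nondominated subset of a list of objective vectors."""
    return [p for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# Toy usage on a biobjective sample: (0.6, 0.7) is dominated by (0.5, 0.5).
pts = [(0.2, 0.9), (0.5, 0.5), (0.6, 0.7), (0.9, 0.1)]
print(nondominated(pts))
```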
On Generalizing Lipschitz Global Methods for Multiobjective Optimization
Lipschitz global methods for single-objective optimization can represent the optimal solutions with a desired accuracy. In this paper, we highlight some directions in which Lipschitz global methods can be extended as faithfully as possible to multiobjective optimization problems. In particular, we present a multiobjective version of the Piyavskii–Shubert algorithm.
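For reference, here is a minimal sketch of the single-objective Piyavskii–Shubert algorithm on an interval, assuming a known Lipschitz constant L; the multiobjective generalization presented in the paper is not reproduced here, and the test function and iteration budget are arbitrary.

```python
# Sketch: single-objective Piyavskii-Shubert minimization on [a, b].
import math

def piyavskii_shubert(f, a, b, L, iters=30):
    """Lipschitz global minimization of f on [a, b] with Lipschitz constant L."""
    pts = sorted([(a, f(a)), (b, f(b))])
    for _ in range(iters):
        # The sawtooth lower bound max_i (f(x_i) - L*|x - x_i|) attains its minimum
        # in each subinterval at the intersection of the two neighbouring cones;
        # evaluate f at the breakpoint with the deepest bound.
        best = None
        for (x1, f1), (x2, f2) in zip(pts, pts[1:]):
            x_new = 0.5 * (x1 + x2) + (f1 - f2) / (2.0 * L)
            bound = 0.5 * (f1 + f2) - 0.5 * L * (x2 - x1)
            if best is None or bound < best[0]:
                best = (bound, x_new)
        pts.append((best[1], f(best[1])))
        pts.sort()
    return min(pts, key=lambda p: p[1])

# Toy usage: a multimodal function whose Lipschitz constant on [0, 4] is below 7.
x_min, f_min = piyavskii_shubert(lambda x: math.sin(3 * x) + 0.5 * x, 0.0, 4.0, L=7.0)
```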
On the Extension of the DIRECT Algorithm to Multiple Objectives
Deterministic global optimization algorithms such as Piyavskii–Shubert, DIRECT, EGO, and many more have a recognized standing for problems with many local optima. Although many single-objective optimization algorithms have been extended to multiple objectives, completely deterministic algorithms for nonlinear problems with guarantees of convergence to global Pareto optimality are still missing. For instance, deterministic algorithms usually make use of some form of scalarization, which may lead to incomplete representations of the Pareto optimal set. Thus, not all global Pareto optima may be obtained, especially in nonconvex cases. On the other hand, algorithms attempting to produce r…
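The point about scalarization can be illustrated with toy data (not taken from the paper): point B below is Pareto optimal but lies in a nonconvex region of the front, so it is never recovered by the weighted-sum scalarization, regardless of the weight.

```python
# Sketch: weighted-sum scalarization missing a Pareto optimum on a nonconvex front.
import numpy as np

# B is Pareto optimal yet lies above the segment joining A and C (nonconvex region).
front = {"A": (0.0, 1.0), "B": (0.6, 0.6), "C": (1.0, 0.0)}

for w in np.linspace(0.0, 1.0, 101):
    best = min(front, key=lambda k: w * front[k][0] + (1 - w) * front[k][1])
    assert best != "B"   # no weight makes B the minimizer of w*f1 + (1-w)*f2
print("B is Pareto optimal yet unreachable by any weighted sum")
```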