
A Systematic Approach to Analyze the Computational Cost of Robustness in Model-Assisted Robust Optimization

Sibghat Ullah, Hao Wang, Stefan Menzel, Bernhard Sendhoff, Thomas Bäck, "A Systematic Approach to Analyze the Computational Cost of Robustness in Model-Assisted Robust Optimization", International Conference on Parallel Problem Solving from Nature (PPSN), 2022.

Abstract

Real-world optimization scenarios under uncertainty and noise are typically handled with robust optimization techniques, which reformulate the original optimization problem into a robust counterpart, e.g., by averaging the function values over different perturbations of a given input. Solving the robust counterpart instead of the original problem can significantly increase the associated computational cost, a cost that, to the best of our knowledge, is often overlooked in the literature. The extra cost incurred by robust optimization may depend on the problem landscape, the dimensionality, the severity of the uncertainty, and the formulation of the robust counterpart. This paper presents an empirical approach to evaluate and compare the computational cost incurred by different robustness formulations in Kriging-based optimization across a broad set of 300 test cases combining different problems, uncertainty levels, and dimensionalities. We focus on the CPU time taken to find robust solutions and consider five commonly applied robustness formulations: “mini-max robustness”, “mini-max regret robustness”, “expectation-based robustness”, “dispersion-based robustness”, and “composite robustness”. We assess the empirical performance of these robustness formulations by means of fixed-budget and fixed-target analyses, from which we find that “mini-max robustness” is the most practical formulation w.r.t. the associated computational cost.
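To make the reformulation idea concrete, the following is a minimal sketch, not the authors' implementation, of how two of the robust counterparts mentioned in the abstract can be approximated via Monte Carlo samples of an additive input perturbation. The objective, the uniform noise model, and all function names are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's code): approximating two robust counterparts
# of an objective f under additive input uncertainty via sampled perturbations.
# The sphere objective and the uniform noise model are assumptions.

def f(x):
    # Hypothetical objective: sphere function.
    return float(np.sum(x ** 2))

def sample_perturbations(dim, level, n_samples, rng):
    # Additive perturbations drawn uniformly from [-level, level]^dim.
    return rng.uniform(-level, level, size=(n_samples, dim))

def minimax_robust(f, x, deltas):
    # Mini-max robustness: worst observed value over the sampled perturbations.
    return max(f(x + d) for d in deltas)

def expectation_robust(f, x, deltas):
    # Expectation-based robustness: mean value over the sampled perturbations.
    return float(np.mean([f(x + d) for d in deltas]))

rng = np.random.default_rng(0)
x = np.array([0.5, -1.0])
deltas = sample_perturbations(dim=2, level=0.1, n_samples=100, rng=rng)
print(minimax_robust(f, x, deltas), expectation_robust(f, x, deltas))
```

Each evaluation of a robust counterpart requires many evaluations of the underlying objective, which is one source of the additional computational cost the paper quantifies.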



