Global Optimization Using Machine Learning


Many approaches to Global Optimization rely on relaxations of nonlinear constraints over specific mathematical primitives. This is restrictive in applications whose constraints are black-box, implicit, or composed of more general primitives. To address these limitations, Bertsimas and Ozturk (2022) proposed OCTHaGOn, which solves black-box global optimization problems by approximating the nonlinear constraints with hyperplane-based Decision Trees and then using those trees to construct a unified mixed-integer optimization (MIO) approximation of the original problem. We provide significant extensions to this approach by (i) approximating the original problem with a much richer family of MIO-representable ML models beyond Decision Trees, (ii) proposing adaptive sampling procedures for more accurate ML-based constraint approximations, (iii) using robust optimization to account for the uncertainty introduced by the sample-dependent training of the ML models, and (iv) leveraging a family of relaxations to address infeasibilities of the final MIO approximation. We demonstrate the improvements from these enhancements on a wide range of Global Optimization benchmarks, show the promise of the enhanced approach in finding globally optimal solutions, and compare it with well-established global optimizers such as BARON.
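To make the sample-then-approximate idea concrete, here is a deliberately minimal toy sketch (not the OCTHaGOn implementation): we query a hypothetical black-box constraint oracle on random samples, fit a depth-1 decision "tree" (a single axis-aligned split, the simplest case of the hyperplane-based trees mentioned above), and use the learned feasible leaf as a linear surrogate constraint. The constraint `x1 <= 0.5`, the domain `[0,1]^2`, and the objective `x1 + x2` are all illustrative assumptions.

```python
import random

# Toy black-box constraint oracle (assumed unknown to the solver):
# feasible iff x1 <= 0.5. Stands in for the implicit constraints
# described in the abstract.
def feasible(x1, x2):
    return x1 <= 0.5

random.seed(0)

# Step 1: sample the domain and label each point with the oracle.
samples = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(500)]
labels = [feasible(x1, x2) for x1, x2 in samples]

# Step 2: fit a depth-1 decision tree (a stump) on x1: choose the
# threshold that misclassifies the fewest labeled samples.
def best_split(points, labels):
    best_t, best_err = None, float("inf")
    for t in sorted(p[0] for p in points):
        # predict "feasible" iff x1 <= t
        err = sum((p[0] <= t) != y for p, y in zip(points, labels))
        if err < best_err:
            best_t, best_err = t, err
    return best_t

t = best_split(samples, labels)

# Step 3: replace the black-box constraint with the learned leaf
# (x1 <= t) and solve the surrogate problem: maximize x1 + x2 over
# [0,1]^2. With only box and threshold constraints the optimum is
# attained at the corner of the feasible leaf.
x_opt = (min(t, 1.0), 1.0)
print(t, x_opt)
```

In the full method, deeper trees (and richer MIO-representable models) yield unions of polyhedral leaves that are embedded as mixed-integer constraints, and the adaptive sampling step would concentrate new oracle queries near the learned boundary `x1 = t` to refine it.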

May 29, 2023 12:00 AM
SIAM Conference on Optimization (OPT23)
Seattle, Washington