Global optimization is a branch of applied mathematics and numerical analysis that attempts to find the global minima or maxima of a function or a set of functions on a given set. It is usually described as a minimization problem, because maximizing the real-valued function $g$ is equivalent to minimizing the function $f := -g$.
Given a possibly nonlinear and non-convex continuous function $f : \Omega \subset \mathbb{R}^n \to \mathbb{R}$, with global minimum $f^*$ and set of all global minimizers $X^*$ in $\Omega$, the standard minimization problem can be given as

$$f^* = \min_{x \in \Omega} f(x),$$

that is, finding $f^*$ and a global minimizer in $X^*$, where $\Omega$ is a compact set defined by inequalities $g_i(x) \geqslant 0$, $i = 1, \ldots, r$.
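As a concrete illustration of this formulation, here is a minimal sketch in Python. The objective (the Rastrigin function, a standard multimodal test case), the bounding box, and the constraint $g_1(x) = 4 - \lVert x \rVert^2$ are illustrative choices rather than part of the definition, and SciPy's `differential_evolution` stands in for one of many possible numerical global methods:

```python
import numpy as np
from scipy.optimize import NonlinearConstraint, differential_evolution

# Illustrative non-convex objective f: R^2 -> R with many local minima
# (the Rastrigin function, a common global-optimization test case).
def f(x):
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

# Compact feasible set Omega inside a bounding box, defined by the
# (illustrative) inequality g_1(x) = 4 - ||x||^2 >= 0.
g1 = NonlinearConstraint(lambda x: 4 - np.dot(x, x), 0, np.inf)
bounds = [(-5.12, 5.12), (-5.12, 5.12)]

# A stochastic global search over Omega; here f* = 0 at x* = (0, 0),
# which lies inside the feasible set.
result = differential_evolution(f, bounds, constraints=(g1,), seed=0)
print(result.x, result.fun)
```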
Global optimization is distinguished from local optimization by its focus on finding the minimum or maximum over the entire given set, as opposed to finding a minimum or maximum that is merely local. Finding an arbitrary local minimum is relatively straightforward using classical local optimization methods. Finding the global minimum of a function is far more difficult: analytical methods are frequently not applicable, and numerical solution strategies often lead to very hard computational challenges.
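To make the distinction concrete, the sketch below uses a made-up one-dimensional test function with two minima; SciPy's BFGS and `differential_evolution` are chosen purely for illustration. Started from $x_0 = 1$, the local method converges to the nearby local minimum, while a global search over the compact interval recovers the global one:

```python
import numpy as np
from scipy.optimize import minimize, differential_evolution

# Illustrative multimodal objective: global minimum near x = -1.30
# (f = -3.51) and a shallower local minimum near x = 1.13 (f = -1.07).
def f(x):
    x = np.atleast_1d(x)[0]
    return x**4 - 3 * x**2 + x

# A classical local method started at x0 = 1.0 descends into the
# nearby local minimum, not the global one.
local = minimize(f, x0=1.0, method="BFGS")

# A global search over the compact interval [-3, 3] finds the
# global minimizer.
best = differential_evolution(f, bounds=[(-3, 3)], seed=0)

print("local :", local.x, local.fun)
print("global:", best.x, best.fun)
```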