In mathematical optimization, the proximal operator is an operator associated with a proper, lower semi-continuous convex function $f$ from a Hilbert space $\mathcal{X}$ to $[-\infty, +\infty]$, and is defined by:

$$\operatorname{prox}_f(v) = \arg\min_{x \in \mathcal{X}} \left( f(x) + \tfrac{1}{2}\,\|x - v\|^2 \right).$$
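As a concrete illustration (not part of the original text), here is a minimal Python/NumPy sketch: for $f(x) = \lambda\|x\|_1$ on $\mathcal{X} = \mathbb{R}^n$, the minimization defining $\operatorname{prox}_f$ has the well-known closed-form solution given by soft thresholding. The function name `prox_l1` is illustrative, not a standard API.

```python
import numpy as np

def prox_l1(v, lam=1.0):
    """Proximal operator of f(x) = lam * ||x||_1 (soft thresholding).

    Solves argmin_x  lam * ||x||_1 + 0.5 * ||x - v||^2  in closed form,
    shrinking each coordinate of v toward zero by lam.
    """
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Example: coordinates with magnitude below lam are set exactly to zero.
v = np.array([3.0, -0.5, 1.2])
print(prox_l1(v, lam=1.0))   # -> approximately [2.0, 0.0, 0.2]
```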
For any function in this class, the minimizer of the right-hand side above is unique, so the proximal operator is well-defined. The $\operatorname{prox}$ of a function enjoys several useful properties for optimization, enumerated below. Note that all of these properties require $f$ to be proper, convex, and lower semi-continuous.
The operator $\operatorname{prox}_f$ is firmly non-expansive, i.e.

$$\|\operatorname{prox}_f x - \operatorname{prox}_f y\|^2 \leq \langle x - y \mid \operatorname{prox}_f x - \operatorname{prox}_f y \rangle \quad \text{for all } (x, y) \in \mathcal{X}^2.$$

Fixed points of $\operatorname{prox}_f$ are precisely the minimizers of $f$:

$$\{x \in \mathcal{X} \mid \operatorname{prox}_f x = x\} = \arg\min f.$$
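As a hedged numerical illustration (added here, not from the original text), the sketch below checks the firm non-expansiveness inequality and the fixed-point property for the soft-thresholding prox of $f(x) = \|x\|_1$, whose unique minimizer is $0$; all names are illustrative.

```python
import numpy as np

def prox_l1(v, lam=1.0):
    # prox of f(x) = lam * ||x||_1 (soft thresholding), as sketched earlier
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

rng = np.random.default_rng(0)
x, y = rng.normal(size=5), rng.normal(size=5)
px, py = prox_l1(x), prox_l1(y)

# Firm non-expansiveness: ||prox x - prox y||^2 <= <x - y, prox x - prox y>
lhs = np.sum((px - py) ** 2)
rhs = np.dot(x - y, px - py)
print(lhs <= rhs + 1e-12)                      # True

# Fixed points are minimizers: for f = ||.||_1 the unique minimizer is 0,
# and 0 is the only point left unchanged by prox_f.
print(np.allclose(prox_l1(np.zeros(5)), 0.0))  # True
```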
Global convergence to a minimizer holds in the following sense: if $\arg\min f \neq \varnothing$, then for any initial point $x_0 \in \mathcal{X}$, the recursion $x_{n+1} = \operatorname{prox}_f x_n$ yields convergence $x_n \to x \in \arg\min f$ as $n \to +\infty$. This convergence may be weak if $\mathcal{X}$ is infinite-dimensional.
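To make the recursion concrete (an illustration added here, not part of the original text), consider $f(x) = \tfrac{1}{2}\|x - a\|^2$, whose prox is $(v + a)/2$ and whose unique minimizer is $a$; the proximal point iterates then converge to $a$ from any starting point. Names are illustrative.

```python
import numpy as np

# f(x) = 0.5 * ||x - a||^2 has prox_f(v) = (v + a) / 2 and unique minimizer a.
a = np.array([1.0, -2.0, 3.0])

def prox_f(v):
    return 0.5 * (v + a)

# Proximal point algorithm: x_{n+1} = prox_f(x_n) from an arbitrary start.
x = np.zeros(3)
for _ in range(60):
    x = prox_f(x)

# Each step halves the distance to a, so the iterates converge to argmin f.
print(np.allclose(x, a))   # True
```

Each iteration here contracts the distance to the minimizer by a factor of $1/2$, which is why convergence is fast in this smooth, strongly convex example; in general the proximal point method only guarantees the (possibly weak) convergence stated above.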