Convex optimization with inequality constraints [closed]
$f:\mathbb{R}^n\to\mathbb{R}$ is a convex function, and $x^* = \arg\min f(x)$. Then $\max\{x^*, 0\}$ must be the optimal solution of the problem \begin{equation} \min f(x) \quad \text{subject to}\quad x \ge 0. \end{equation} Why, and how can one prove this?
Solution 1:
If $x^* \ge 0$, great, you're done: $x^*$ is feasible and minimizes $f$ everywhere, so in particular over the feasible set. Otherwise ($x^* < 0$ is infeasible), the bound constraint must be active at the solution ($x = 0$): since $f$ is convex with minimizer $x^*$, it is nondecreasing on $[x^*, \infty)$, so every feasible $x \ge 0 > x^*$ satisfies $f(x) \ge f(0)$, and the constrained minimum is attained at $x = 0$.
Intuitively, if you start from some initial point $x^{(0)}$ and follow a direction in which $f$ decreases, you'll end up either at the unconstrained minimum (if it is feasible), or you'll hit, and stay at, the boundary $x = 0$ of the feasible domain.
Let's sum up: the constrained minimizer is $x^*$ if $x^* \ge 0$, and $0$ otherwise. In one formula, that is $\max(x^*, 0)$.
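As a quick numeric sanity check of the scalar case, here is a minimal sketch with an illustrative convex function (the particular $f$ and its minimizer are assumptions for the example, not from the question): the unconstrained minimizer is negative, and a brute-force search over $x \ge 0$ lands on the projected point $\max(x^*, 0) = 0$.

```python
# Illustrative convex function with unconstrained minimizer x* = -2 < 0.
def f(x):
    return (x + 2) ** 2

x_star = -2.0                 # unconstrained argmin of f
projected = max(x_star, 0.0)  # the claimed constrained minimizer

# Brute-force the constrained problem min f(x) s.t. x >= 0 on a fine grid.
grid = [i * 0.001 for i in range(10001)]  # x in [0, 10]
best = min(grid, key=f)

print(projected, best)  # both 0.0: the constraint is active at the solution
```

Since $f$ is nondecreasing to the right of $x^* = -2$, the grid search confirms that the feasible minimum sits at the boundary $x = 0$, matching $\max(x^*, 0)$.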