Proximal Operator of the Euclidean Norm ($ {L}_{2} $ Norm) without Using Moreau Decomposition

$\renewcommand{\Re}{\mathbb{R}}$ We need to solve the optimization problem $$ \mathrm{prox}_{\lambda\|{}\cdot{}\|}(v) = \mathrm{argmin}_{x\in\Re^n}\|x\| + \tfrac{1}{2\lambda}\|x-v\|^2. $$ For convenience and without loss of generality, let $\lambda=1$. I assume that $x\neq 0$. The optimality condition we need to solve is \begin{align} \frac{x}{\|x\|} = v-x.\tag{1}\label{eq:1} \end{align} From \eqref{eq:1} we see that $x$ is parallel to $v$ (again, provided that $x\neq 0$). Indeed, $v=(1+1/\|x\|)x$. Let us assume that $x(v)$ has the parametric form $x(v) = \sigma(v)\cdot v$, where $\sigma:\Re^n\to\Re$. Substituting into \eqref{eq:1}, \begin{align} \frac{\sigma v}{|\sigma|\|v\|} = v-\sigma v,\tag{2} \end{align} from which we conclude that $x(v)=\sigma(v) \cdot v$ must be one of the two candidates \begin{align} x(v) = \sigma v = \left(1 \pm \tfrac{1}{\|v\|} \right)v.\tag{3}\label{eq:3} \end{align} We plug \eqref{eq:3} into \eqref{eq:1} and check whether it satisfies the optimality condition (recall that the solution is unique). We may verify that \begin{align} x(v) = \left(1 - \tfrac{1}{\|v\|} \right)v\tag{4}\label{eq:4} \end{align} is indeed a solution, provided that $1 - \tfrac{1}{\|v\|}\geq 0$, that is, $\|v\|\geq 1$. When $\|v\| < 1$, the assumption $x\neq 0$ cannot hold and the minimizer is $x = 0$.
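For completeness, here is a minimal numerical sketch of the resulting block soft-thresholding formula (written in Python/NumPy, which is my own choice and not part of the original answers), stated for a general $\lambda > 0$ rather than the $\lambda = 1$ normalization used above:

```python
import numpy as np

def prox_l2_norm(v, lam=1.0):
    """Proximal operator of lam * ||.||_2 (block soft-thresholding).

    Returns argmin_x ||x|| + (1 / (2 * lam)) * ||x - v||^2, i.e.
    (1 - lam / ||v||) * v when ||v|| > lam, and 0 otherwise.
    """
    norm_v = np.linalg.norm(v)
    if norm_v <= lam:
        # The assumption x != 0 fails here; the minimizer is the origin.
        return np.zeros_like(v)
    return (1.0 - lam / norm_v) * v
```

For instance, `prox_l2_norm(np.array([3.0, 4.0]), lam=1.0)` returns $(1 - 1/5)\,[3, 4] = [2.4, 3.2]$.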


This is just my approach to solving the question, but I have accepted Pantelis's answer.

Starting from the beginning:

$\mathrm{prox}_{\lambda\|\cdot\|}(y) = \mathrm{argmin}_x \|x\| + \frac{1}{2\lambda}\|x-y\|^2$.

Setting the gradient to zero (valid for $x \neq 0$) and multiplying through by $\lambda$:

$\lambda\frac{x}{\|x\|} + (x-y) = 0$

$\left(\frac{\lambda}{\|x\|} + 1\right)x = y$

Take the norm of both sides:

$\left(\frac{\lambda}{\|x\|} + 1\right)\|x\| = \|y\|$

Treat $\|x\|$ as $z$:

$\lambda + z = \|y\|$, so $z = \|y\| - \lambda$.

We now know $\|x\| = \|y\| - \lambda$ (assuming $\|y\| > \lambda$; otherwise the minimizer is $x = 0$), so plug back into the equation:

$\left(\frac{\lambda}{\|y\| - \lambda} + 1\right)x = y$, and solving for $x$ gives $x = \left(1 - \frac{\lambda}{\|y\|}\right)y$, which is the same answer as Pantelis's.
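As a quick sanity check (again just a sketch of my own, not part of either answer), one can plug this closed form back into the stationarity condition $\lambda\frac{x}{\|x\|} + (x-y) = 0$ used above:

```python
import numpy as np

lam = 0.7
y = np.array([1.0, -2.0, 2.0])           # ||y|| = 3 > lam, so x != 0
x = (1.0 - lam / np.linalg.norm(y)) * y  # closed-form prox

# Residual of the optimality condition lam * x / ||x|| + (x - y) = 0
residual = lam * x / np.linalg.norm(x) + (x - y)
print(np.allclose(residual, 0.0))        # True
```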