Proximal Operator of $ f \left( x \right) = {\left\| A x \right\|}_{2} $ Where $ A $ Is a Diagonal Matrix (Weighted $ {L}_{2} $ Norm)

Assume throughout that the diagonal entries of $A$ are nonzero, so that $A$ is invertible. We have $$\mathrm{prox}_{f}(x) = \operatorname{argmin}_{u\in \mathbb{R}^n} \left\{\|Au\|_2 + \tfrac{1}{2}\|u - x\|_2^2\right\}. \tag{1}$$ Let $u^\ast$ be the solution of the optimization problem in (1). Note that if $u\ne 0$, then $\|Au\|_2$ is differentiable with $\nabla \|Au\|_2 = \frac{A^TAu }{\|Au\|_2}$. Thus, if $u^\ast \ne 0$, the gradient of the objective function at $u^\ast$ vanishes, i.e. $$\frac{A^TAu^\ast }{\|Au^\ast \|_2} + u^\ast - x = 0,$$ which results in $$u^\ast = \Big(\frac{A^TA }{\|Au^\ast \|_2} + I_n\Big)^{-1} x. \tag{2}$$ If no $u^\ast \ne 0$ satisfies (2), then $u^\ast = 0$.
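As a sanity check, (2) suggests a simple fixed-point iteration $u \leftarrow \big(\frac{A^TA}{\|Au\|_2} + I_n\big)^{-1} x$. The following minimal numpy sketch (my own illustration, not part of the derivation; it assumes all diagonal entries $a_i \ne 0$, and the starting point and tolerances are arbitrary) returns $0$ when the iteration collapses, matching the $u^\ast = 0$ case:

```python
import numpy as np

def prox_fixed_point(x, a, tol=1e-12, max_iter=1000):
    """Iterate (2) for f(u) = ||A u||_2 with A = diag(a) (all a_i nonzero)."""
    u = np.asarray(x, dtype=float).copy()
    for _ in range(max_iter):
        norm_Au = np.linalg.norm(a * u)
        if norm_Au < tol:                      # no nonzero solution of (2): prox is 0
            return np.zeros_like(u)
        u_new = x / (a**2 / norm_Au + 1.0)     # componentwise inverse in (2)
        if np.linalg.norm(u_new - u) < tol:
            return u_new
        u = u_new
    return u
```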

Let us solve (2). Let $A = \mathrm{diag}(a_1, a_2, \ldots, a_n)$ and $\lambda = \|Au^\ast \|_2 > 0$. From (2), we have $$u^\ast_i = \frac{\lambda}{a_i^2 + \lambda} x_i, \quad \forall i. \tag{3}$$ Substituting (3) into $\lambda^2 = \|Au^\ast \|_2^2$ and dividing by $\lambda^2 > 0$, we get $$\sum_{i=1}^n \frac{a_i^2x_i^2}{(a_i^2 + \lambda)^2} = 1. \tag{4}$$ Let $$F(\lambda) = \sum_{i=1}^n \frac{a_i^2x_i^2}{(a_i^2 + \lambda)^2}.$$ Note that $\lim_{\lambda\to\infty} F(\lambda) = 0$ and $F(0) = \sum_{i=1}^n \frac{x_i^2}{a_i^2} = \|A^{-1}x\|_2^2$. Also, $F(\lambda)$ is strictly decreasing on $[0, \infty)$. Thus, $F(\lambda) = 1$ has a unique positive solution if and only if $F(0) > 1$, i.e. $\|A^{-1}x\|_2 > 1$. As a result, we have $$\mathrm{prox}_{f}(x) = \left\{\begin{array}{cc} 0 & \|A^{-1}x\|_2 \le 1 \\[6pt] (\frac{1}{\lambda}A^TA + I_n)^{-1} x & \|A^{-1}x\|_2 > 1 \end{array} \right. \tag{5}$$ where $\lambda$ is the unique positive solution of (4).
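Numerically, it is easier to hand (4) to a scalar root-finder than to iterate (2) directly. A minimal sketch using scipy (the helper name `prox_weighted_l2` and the bracket-doubling loop are my own choices; again all $a_i \ne 0$ is assumed):

```python
import numpy as np
from scipy.optimize import brentq

def prox_weighted_l2(x, a):
    """prox of f(u) = ||A u||_2, A = diag(a), via (4)-(5)."""
    if np.linalg.norm(x / a) <= 1.0:        # case ||A^{-1} x||_2 <= 1 in (5)
        return np.zeros_like(x)
    F = lambda lam: np.sum(a**2 * x**2 / (a**2 + lam)**2) - 1.0
    hi = 1.0
    while F(hi) > 0:                        # F is strictly decreasing: double until bracketed
        hi *= 2.0
    lam = brentq(F, 0.0, hi)                # unique positive root of (4)
    return lam / (a**2 + lam) * x           # componentwise form (3)
```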

Remark 1: In general, $\lambda$ cannot be expressed in closed form (see Remark 2). If $A = I_n$ (or, more generally, $A = \alpha I_n$), $\lambda$ can be expressed in closed form. For example, if $A = I_n$, then (4) becomes $\frac{\|x\|_2^2}{(1+\lambda)^2} = 1$ and hence $\lambda = \|x\|_2 - 1$. Thus, from (5), we have \begin{align} \mathrm{prox}_{\|x\|_2}(x) &= \left\{\begin{array}{cc} 0 & \|x\|_2 \le 1 \\[6pt] (1 - \frac{1}{\|x\|_2})x & \|x\|_2 > 1 \end{array} \right.\\ &= \Big(1 - \frac{1}{\max(\|x\|_2, 1)}\Big)x. \end{align} This is a well-known result.
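A quick numerical check of this reduction, with an arbitrary test vector:

```python
import numpy as np

x = np.array([3.0, -4.0])                   # ||x||_2 = 5 > 1
lam = np.linalg.norm(x) - 1.0               # closed-form root of (4) for A = I_n
print(lam / (1.0 + lam) * x)                # via (3):            [ 2.4 -3.2]
print((1.0 - 1.0 / np.linalg.norm(x)) * x)  # well-known formula: [ 2.4 -3.2]
```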

Remark 2: Numerical results verify (5); we use CVX+Matlab to solve the optimization problem in (1) and compare it with the closed form in (5).

Let us see an example. Let $x = [-3, 2, -1, 1]^T$ and $A = \mathrm{diag}(1, 2, 3, 4)$. Equation (4) becomes $$\frac{9}{(\lambda+1)^2} + \frac{16}{(\lambda+4)^2} + \frac{9}{(\lambda+9)^2} + \frac{16}{(\lambda+16)^2} = 1,$$ which, after clearing denominators, results in \begin{align} &\lambda^8+60\lambda^7+1396\lambda^6+15840\lambda^5+86511\lambda^4\\ & +143320\lambda^3-568624\lambda^2-2517120\lambda-3043584 = 0. \end{align} In general, a polynomial equation of degree eight has no closed-form solution. Using Maple, we obtain $\lambda = 2.989390606\ldots$. Both the CVX solution of (1) and the closed form (5) give $$\mathrm{prox}_{f}([-3, 2, -1, 1]^T) \approx [-2.2480, 0.8554, -0.2493, 0.1574]^T.$$
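For readers without CVX+Matlab, the same comparison can be reproduced with scipy's general-purpose minimizer as a stand-in (my own substitute check, not the original verification code):

```python
import numpy as np
from scipy.optimize import minimize

x = np.array([-3.0, 2.0, -1.0, 1.0])
a = np.array([1.0, 2.0, 3.0, 4.0])

# Minimize the objective in (1) directly.
obj = lambda u: np.linalg.norm(a * u) + 0.5 * np.linalg.norm(u - x)**2
res = minimize(obj, x, method="Nelder-Mead",
               options={"xatol": 1e-12, "fatol": 1e-12, "maxiter": 10**5})
print(res.x)                                # ~ [-2.2480  0.8554 -0.2493  0.1574]

# Closed form (3) with the root of (4) found by Maple.
lam = 2.989390606
print(lam / (a**2 + lam) * x)               # ~ [-2.2480  0.8554 -0.2493  0.1574]
```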



Alternatively, according to the Moreau decomposition, for $t > 0$ \begin{equation} x = \operatorname{prox}_{t f}(x) + t \operatorname{prox}_{t^{-1} f^{*}}(x / t). \end{equation} We first compute the conjugate of $f(x)=\|Ax\|$ (see the computation below): \begin{equation} \begin{aligned} f^*(u) &= \sup_{x}~ \langle u,x \rangle - f(x) \\ &= \left\{ \begin{array}{cl} {0} & {\text{if}~\|A^{-1}u\| \leq 1,} \\ {+\infty} & {\text{otherwise.}} \end{array} \right. \end{aligned} \end{equation} Then \begin{equation} \begin{aligned} \operatorname{prox}_{t^{-1}f^*}(x/t) &= \arg\min_{u} f^*(u) + \frac{t}{2}\left\|u-\frac{x}{t}\right\|^2 \\ &= \arg\min_{\|A^{-1}u\| \leq 1} \frac{t}{2}\left\|u-\frac{x}{t}\right\|^2 \\ &= \left\{ \begin{array}{cl} {\frac{x}{t}} & {\text{if}~\|A^{-1}x\| \leq t ,} \\ {(tA^TA + \lambda I)^{-1}A^TAx} & {\text{otherwise,}} \end{array} \right. \end{aligned} \end{equation} where $\lambda > 0$ is the solution of \begin{equation} \left\|(tA^TA+ \lambda I)^{-1}Ax\right\| = 1. \end{equation} Finally, we have \begin{equation} \begin{aligned} \operatorname{prox}_{tf}(x) &= x-t \operatorname{prox}_{t^{-1}f^*}(x/t) \\ &= \left\{ \begin{array}{cl} {0} & {\text{if}~\|A^{-1}x\| \leq t ,} \\ {\big(I - (A^TA + \frac{\lambda}{t}I)^{-1}A^TA\big)x} & {\text{otherwise.}} \end{array} \right. \end{aligned} \end{equation} (If $A=I$, then $f$ reduces to the Euclidean norm and this recovers the classical formula.)
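This route can also be checked numerically. The sketch below (the helper name `prox_tf` is my own) finds $\lambda$ from the scalar constraint equation with a root-finder and compares against direct minimization of $t\|Au\| + \frac{1}{2}\|u-x\|^2$:

```python
import numpy as np
from scipy.optimize import brentq, minimize

def prox_tf(x, a, t):
    """prox_{t f}(x) for f(u) = ||A u||_2, A = diag(a), via the Moreau route."""
    if np.linalg.norm(x / a) <= t:          # x/t already lies in the constraint set
        return np.zeros_like(x)
    g = lambda lam: np.sum(a**2 * x**2 / (t * a**2 + lam)**2) - 1.0
    hi = 1.0
    while g(hi) > 0:                        # bracket the unique positive root
        hi *= 2.0
    lam = brentq(g, 0.0, hi)                # solves ||(t A^T A + lam I)^{-1} A x|| = 1
    return (1.0 - a**2 / (a**2 + lam / t)) * x

x = np.array([-3.0, 2.0, -1.0, 1.0])
a = np.array([1.0, 2.0, 3.0, 4.0])
t = 0.5
obj = lambda u: t * np.linalg.norm(a * u) + 0.5 * np.linalg.norm(u - x)**2
res = minimize(obj, x, method="Nelder-Mead",
               options={"xatol": 1e-12, "fatol": 1e-12, "maxiter": 10**5})
print(prox_tf(x, a, t))                     # the two outputs should agree
print(res.x)
```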

Since $A$ is a diagonal matrix with positive diagonal entries, the Fenchel conjugate of $f(x)$ can be computed by substituting $y = Ax$: \begin{equation} \begin{aligned} f^*(u) &= \sup_{x} \langle u, x \rangle - \|Ax\| \\ &=\sup_{y} \langle A^{-1}u, y \rangle - \|y\| = g^*(A^{-1}u), \end{aligned} \end{equation} where $g$ is the Euclidean norm $g(x)=\|x\|$. The conjugate of $g$ is \begin{equation} g^*(x^*)= \left\{ \begin{array}{cl} {0} & {\text{if}~\|x^*\| \leq 1,} \\ {+\infty} & {\text{otherwise,}} \end{array} \right. \end{equation} and thus \begin{equation} f^*(u) = g^*(A^{-1}u)= \left\{ \begin{array}{cl} {0} & {\text{if}~\|A^{-1}u\| \leq 1,} \\ {+\infty} & {\text{otherwise.}} \end{array} \right. \end{equation}
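To see the two cases of $f^*$ concretely, evaluate $\langle u, x \rangle - \|Ax\|$ along the ray $x = s\,A^{-2}u$, where it equals $s\,r(r-1)$ with $r = \|A^{-1}u\|$: the objective stays bounded above by $0$ when $r \le 1$ and diverges when $r > 1$. A tiny illustration with arbitrary values:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])
for u in (np.array([0.5, 0.5, 0.5, 0.5]),   # r = ||A^{-1} u|| < 1: f*(u) = 0
          np.array([4.0, 4.0, 4.0, 4.0])):  # r > 1:                f*(u) = +inf
    r = np.linalg.norm(u / a)
    for s in (1.0, 10.0, 100.0):
        xs = s * u / a**2                   # the ray x = s A^{-2} u
        print(r, s, u @ xs - np.linalg.norm(a * xs))
```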

For the computation of $\lambda$, one can refer to @River Li's answer, or solve for it via the Lagrange multiplier method; the scalar root-finding in the sketch above does exactly this.