Expected value of the squared infinity norm of a vector of i.i.d. Gaussians

Let $Z_i:=x_i$ and $M:=M_n:=\|x\|_\infty=\max_1^n|Z_i|$. By rescaling, without loss of generality $\sigma=1$. So, for real $u>0$ \begin{multline} P(M^2>u)=P(M>\sqrt u)=1-P(\max_1^n|Z_i|\le\sqrt u)=1-P(|Z_1|\le\sqrt u)^n \\ =1-(1-2G(\sqrt u))^n=1-e^{-ng(u)}, \tag{1} \end{multline} where $$G(x):=P(Z_1>x)\sim\frac1{x\sqrt{2\pi}}e^{-x^2/2} $$ as $x\to\infty$ and $$g(u):=-\ln(1-2G(\sqrt u))\sim2G(\sqrt u)\sim\frac2{\sqrt{2\pi u}}e^{-u/2} =e^{-u/(2+o(1))} $$ as $u\to\infty$.
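
As a quick sanity check of (1), one can compare the closed form for $P(M^2>u)$ with a Monte Carlo estimate. The following Python sketch takes $\sigma=1$ and assumes SciPy is available; the names `G` and `tail_prob` are ad hoc.

```python
# Sanity check of (1) for sigma = 1: compare the closed form for
# P(M^2 > u) with a Monte Carlo estimate.
import numpy as np
from scipy.stats import norm

def G(x):
    return norm.sf(x)  # standard normal tail probability P(Z_1 > x)

def tail_prob(u, n):
    return 1 - (1 - 2 * G(np.sqrt(u)))**n  # P(M^2 > u), as in (1)

rng = np.random.default_rng(0)
n, u = 100, 6.0
samples = rng.standard_normal((200_000, n))
mc = np.mean(np.max(np.abs(samples), axis=1)**2 > u)
print(tail_prob(u, n), mc)  # the two values should nearly agree
```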

Also, $g(u)$ decreases from $\infty$ to $0$ as $u$ increases from $0$ to $\infty$. So, for each natural $n\ge3$ there are unique positive real numbers $u_n$ and $v_n$ such that $$ng(u_n)=\ln n,\quad ng(v_n)=1. $$ Clearly, $0<u_n<v_n<\infty$. Also, $$\frac{\ln n}n=g(u_n)=e^{-u_n/(2+o(1))}\quad\text{and}\quad \frac{\ln n}n=e^{-(1+o(1))\ln n}, $$ whence $$u_n\sim2\ln n\quad\text{and, similarly,}\quad v_n\sim2\ln n. $$
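
The crossover points $u_n$ and $v_n$ have no closed form, but they are easy to compute by root finding. Here is a sketch (again for $\sigma=1$, assuming SciPy) that solves $ng(u_n)=\ln n$ and $ng(v_n)=1$ numerically and compares both roots with $2\ln n$.

```python
# Solve n*g(u_n) = ln(n) and n*g(v_n) = 1 numerically (sigma = 1) and
# compare the roots with the asymptotic value 2*ln(n).
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def g(u):
    # g(u) = -ln(1 - 2 G(sqrt(u))); log1p is used for numerical accuracy
    return -np.log1p(-2 * norm.sf(np.sqrt(u)))

for n in (10, 100, 10_000, 10**6):
    u_n = brentq(lambda u: n * g(u) - np.log(n), 1e-9, 200.0)
    v_n = brentq(lambda u: n * g(u) - 1.0, 1e-9, 200.0)
    print(n, u_n, v_n, 2 * np.log(n))  # u_n < v_n, both ~ 2 ln n
```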

Next, \begin{equation} EM^2=\int_0^\infty P(M^2>u)\,du=\int_0^\infty (1-e^{-ng(u)})\,du=I_1+I_2+I_3, \tag{2} \end{equation} where $$I_1:=\int_0^{u_n}(1-e^{-ng(u)})\,du,\quad I_2:=\int_{u_n}^{v_n}(1-e^{-ng(u)})\,du,\quad I_3:=\int_{v_n}^\infty (1-e^{-ng(u)})\,du. $$ If $0<u<u_n$, then $0<e^{-ng(u)}<e^{-ng(u_n)}=1/n$, so that $(1-1/n)u_n\le I_1\le u_n$ and hence $$I_1\sim u_n.$$ Next, $$I_2\le v_n-u_n=o(u_n). $$ Using $1-e^{-x}\le x$ for $x\ge0$, $$I_3<\int_{v_n}^\infty ng(u)\,du\sim \int_{v_n}^\infty n\frac2{\sqrt{2\pi u}}e^{-u/2}\,du \sim 2n\frac2{\sqrt{2\pi v_n}}e^{-v_n/2} \sim 2ng(v_n)=2=o(u_n). $$ We conclude that, for $\sigma=1$, $$E\|x\|_\infty^2=EM^2\sim u_n\sim2\ln n. $$ So, for any real $\sigma>0$, $$E\|x\|_\infty^2\sim2\sigma^2\ln n. $$
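
A small Monte Carlo experiment illustrates this asymptotic; as is typical for extreme values, the convergence is quite slow. The sketch below again takes $\sigma=1$.

```python
# Monte Carlo illustration of E||x||_inf^2 ~ 2 ln n for sigma = 1.
import numpy as np

rng = np.random.default_rng(0)
for n in (10, 100, 1000, 10_000):
    m2 = np.max(np.abs(rng.standard_normal((2_000, n))), axis=1)**2
    print(n, m2.mean(), 2 * np.log(n), m2.mean() / (2 * np.log(n)))
    # the ratio in the last column drifts toward 1 only slowly
```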


Along the same lines, one can give an explicit non-asymptotic lower bound on $E\|x\|_\infty^2$ which is asymptotically equivalent to $E\|x\|_\infty^2$ as $n\to\infty$. Indeed, one can use the inequality $$G(t)\ge B(t):=\frac{f(t)}{\sqrt{t^2+2}} $$ for real $t\ge0$, where $f$ is the standard normal pdf, so that $f(t)=\frac1{\sqrt{2\pi}}\,e^{-t^2/2}$ for real $t$. The latter lower bound on $G(t)$ is a simpler, if slightly less accurate, version of Birnbaum's lower bound $B_1(t):=f(t)(\sqrt{t^2+4}-t)/2$ on $G(t)$; we have $B_1(t)>B(t)$ for all real $t\ge0$. So, assuming that $\sigma=1$ and $n\ge16$ (so that $\ln\ln n>1$), letting \begin{equation*} w_n:=2\ln n-2\ln\ln n, \end{equation*} and recalling (1), for $u\in[0,w_n]$ we have \begin{align*} P(M^2>u)&=1-(1-2G(\sqrt u))^n \\ &\ge1-\exp\big\{-2nG(\sqrt u)\big\} \\ &\ge1-\exp\big\{-2nG(\sqrt w_n)\big\} \\ &\ge1-\exp\Big\{-2n\frac{f(\sqrt w_n)}{\sqrt{w_n+2}}\Big\} \\ &=1-\exp\Big\{-\frac1{\sqrt\pi}\frac{\ln n}{\sqrt{1+\ln n-\ln\ln n}}\Big\} \\ &\ge1-\delta_n, \end{align*} where \begin{equation*} \delta_n:=\exp\Big\{-\sqrt{\frac{\ln n}\pi}\,\Big\}\to0 \end{equation*} and the last inequality uses $\ln\ln n\ge1$. Now it follows from (2) that, for $\sigma=1$, \begin{equation*} E\|x\|_\infty^2=EM^2\ge\int_0^{w_n} P(M^2>u)\,du \ge(1-\delta_n)w_n=(1-\delta_n)(1-\epsilon_n)2\ln n, \end{equation*} where \begin{equation*} \epsilon_n:=\frac{\ln\ln n}{\ln n}\to0. \end{equation*} So, for any real $\sigma>0$ and any $n\ge16$, $$E\|x\|_\infty^2\ge2(1-\delta_n)(1-\epsilon_n)\sigma^2\ln n\sim2\sigma^2\ln n, $$ as claimed.
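
To get a feel for how quickly $2(1-\delta_n)(1-\epsilon_n)\ln n$ approaches $2\ln n$, one can tabulate both quantities; a minimal sketch for $\sigma=1$:

```python
# Tabulate the explicit lower bound 2(1-delta_n)(1-eps_n) ln n against
# the asymptotic value 2 ln n, for sigma = 1 and n >= 16.
import numpy as np

for n in (16, 100, 10_000, 10**8):
    delta = np.exp(-np.sqrt(np.log(n) / np.pi))   # delta_n
    eps = np.log(np.log(n)) / np.log(n)           # epsilon_n
    print(n, 2 * (1 - delta) * (1 - eps) * np.log(n), 2 * np.log(n))
```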

For $n=1,\dots,15$, the values of $E\|x\|_\infty^2$ can be easily computed numerically, using (2) and (1), with any degree of accuracy, say to get these $15$ approximate values for $E\|x\|_\infty^2/\sigma^2$: $1, 1.63662, 2.10266, 2.47021, 2.77375, 3.03236, 3.25771, 3.45743, 3.6368, 3.79962, 3.9487, 4.08621, 4.21382, 4.33288, 4.44447$.
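
For instance, the table above can be reproduced with one-dimensional quadrature; a sketch, assuming SciPy and truncating the integral in (2) at $u=100$ (the tail is negligible for $n\le15$):

```python
# Reproduce the tabulated values of E||x||_inf^2 / sigma^2 from (2) and (1).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def EM2(n):
    # E M^2 = int_0^inf (1 - (1 - 2 G(sqrt(u)))^n) du, truncated at u = 100
    return quad(lambda u: 1 - (1 - 2 * norm.sf(np.sqrt(u)))**n, 0, 100)[0]

print([round(EM2(n), 5) for n in range(1, 16)])
# [1.0, 1.63662, 2.10266, 2.47021, ..., 4.44447]
```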


It is known that the maximum of $n$ i.i.d. subgaussian random variables with variance $\sigma^2$ is of order $\sigma \sqrt{\log n}$, so one can expect the squared maximum to be roughly $\sigma^2 \log n$. A reference for this result is 'High-Dimensional Probability' by Vershynin. In this case, Jensen's inequality immediately gives $$\mathbf{E}[\|x\|_{\infty}^2] \ge (\mathbf{E}[\|x\|_{\infty}])^2 = \Omega(\sigma^2 \log n).$$

We can also show that this is the right order of magnitude. Consider an arbitrary $\lambda > 0$ ($\lambda$ will have to satisfy a condition that we will address later). Then

\begin{align*} \exp(\lambda \mathbf{E}(\max_i |x_i|)^2) &\le \mathbf{E}\exp(\lambda (\max_i |x_i|)^2) \\ &= \mathbf{E} \max_i \exp(\lambda |x_i|^2) \\ &\le \sum_{i=1}^n \mathbf{E}\exp(\lambda x_i^2), \end{align*} where the first step is Jensen's inequality. Each summand in the last line is the MGF of $x_i^2$ (a scaled chi-squared random variable), which has the explicit form $1/\sqrt{1-2\lambda \sigma^2}$ for $\lambda\sigma^2 < 1/2$. Taking logs and using $\log y \le y$, we have $$ \mathbf{E}(\max_i |x_i|)^2 \le \frac{\log n}{\lambda} + \frac{1}{\lambda}\log\frac1{\sqrt{1-2\lambda \sigma^2}} \le \frac{\log n}{\lambda} + \frac{1}{\lambda \sqrt{1-2\lambda \sigma^2}}. $$ To approximately optimize this quantity in $\lambda$, we let $\lambda$ be such that $$\lambda \sigma^2 = \frac{1}2 - \frac{1}{2 \log(n)^2}.$$ (Note that for the finiteness of the MGF above we needed $\lambda \sigma^2 < 1/2$, which is satisfied here; also, $\lambda>0$ requires $\log n>1$, i.e., $n\ge3$.) Plugging back in, both terms equal $\frac{2\sigma^2\log(n)}{1-1/\log(n)^2}$, so that $$ \mathbf{E}(\max_i |x_i|)^2 \le \frac{4 \sigma^2 \log(n)}{1-1/\log(n)^2}, $$ and $\sigma^2 \log n$ is indeed the right order.
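
As a rough check that the constant is of the right size, one can compare this bound with a Monte Carlo estimate of $\mathbf{E}(\max_i |x_i|)^2$; a sketch with $\sigma=1$ (since the first answer shows the truth is $\sim 2\sigma^2\log n$, roughly a factor of $2$ of slack is expected):

```python
# Compare the MGF bound 4 sigma^2 log(n) / (1 - 1/log(n)^2) with a Monte
# Carlo estimate of E(max_i |x_i|)^2, for sigma = 1 (valid for n >= 3).
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0
for n in (10, 100, 1000):
    m2 = np.max(np.abs(sigma * rng.standard_normal((10_000, n))), axis=1)**2
    bound = 4 * sigma**2 * np.log(n) / (1 - 1 / np.log(n)**2)
    print(n, m2.mean(), bound)  # the bound holds, with roughly a factor 2 slack
```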