How to prove that the sum of squares $\sum_{i=1}^n x_i^2$, subject to $\sum_{i=1}^n x_i = k$, is minimized when all the $x_i$ are equal?

HINT: By Cauchy-Schwarz we know

$$\left(\sum x_i y_i\right)^2 \leq \left(\sum x_i^2\right) \left(\sum y_i^2\right)$$

Take $y_i = 1$ for all $i$ to get a lower bound on $\sum x_i^2$. Then show that $x_i = \frac{k}{n}$ achieves this bound.
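Spelling out the hint: with $y_i = 1$ the inequality becomes
$$k^2 = \left(\sum_{i=1}^n x_i \cdot 1\right)^2 \leq \left(\sum_{i=1}^n x_i^2\right)\left(\sum_{i=1}^n 1^2\right) = n \sum_{i=1}^n x_i^2,$$
so $\sum_{i=1}^n x_i^2 \geq \frac{k^2}{n}$. Equality in Cauchy-Schwarz holds exactly when $(x_1,\ldots,x_n)$ is proportional to $(1,\ldots,1)$, i.e. when all the $x_i$ are equal, and the constraint $\sum_i x_i = k$ then forces $x_i = \frac{k}{n}$, which indeed gives $\sum_i x_i^2 = \frac{k^2}{n}$.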


Let $c = k/n$. Then, for all $(x_1,\ldots,x_n)$ such that $\sum_i x_i = k$,
$$ \newcommand{\s}{\sum_{i=1}^n} \s x_i^2 = \s (c + x_i - c)^2 = c^2 n + \s (x_i - c)^2 \>, $$
since $2 \s c(x_i-c) = 0$. The right-hand side is obviously minimized by taking $x_i = c$ for all $i$, and so the result follows.
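To make the cross term explicit: it vanishes because the constraint $\sum_i x_i = k = nc$ forces the deviations from $c$ to sum to zero,
$$2 \sum_{i=1}^n c\,(x_i - c) = 2c\left(\sum_{i=1}^n x_i - nc\right) = 2c\,(k - k) = 0.$$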


I think this reeks of the AM-QM inequality. The $x_i$ have a fixed arithmetic mean of $\frac{k}{n}$, while the quadratic mean
$$ \sqrt{\frac{x_1^2 + \cdots + x_n^2}{n}} $$
is bounded below by that same number, which means that the sum of squares is bounded below by $\frac{k^2}{n}$, attained exactly when the $x_i$ are all equal.
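Making the last step explicit (written with $|\cdot|$ so it also covers negative $k$): since
$$\sqrt{\frac{x_1^2 + \cdots + x_n^2}{n}} \;\geq\; \left|\frac{x_1 + \cdots + x_n}{n}\right| = \frac{|k|}{n},$$
squaring and multiplying by $n$ gives $\sum_{i=1}^n x_i^2 \geq \frac{k^2}{n}$, with equality exactly when $x_1 = \cdots = x_n = \frac{k}{n}$.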

Tags:

Optimization