Claim: if the $C^1$ functions $f_n:\mathbb{R}^p\to \mathbb{R}$ converge uniformly to a $C^1$ function $f$, then for every point $x$ there is a sequence $(x_n)_n$ with $\lim x_n=x$ such that $\mathrm{d}f_n(x_n)$ converges to $\mathrm{d}f(x)$.

We first prove the result in the special case where the limit function has a strict local maximum at the point of interest. At such a maximum the differential vanishes, so in that case the claim takes the following form:

Lemma. If $(g_n)_n$ is a sequence of $C^1$ functions $\mathbb{R}^p\to \mathbb{R}$ converging uniformly to a $C^1$ function $g$ having a strict local maximum at $y$, that is, $g(x)<g(y)$ for all $x\neq y$ in a closed ball $\overline{B}(y,r)$ around $y$, then there exists a sequence $(x_n)_n$ such that $\lim x_n=y$ and $dg_n(x_n)=0$ for all sufficiently large $n$.

Proof of the Lemma: Pick $N$ sufficiently large so that for all $n\geq N$, $$\sup_{\|x-y\|=r} g_n(x)<g_n(y).$$ Such an $N$ exists: the corresponding strict inequality holds for $g$ with some positive gap $\delta$, since the sphere $\{\|x-y\|=r\}$ is compact and $y$ is a strict maximum, and uniform convergence transfers it to $g_n$ once $\sup|g_n-g|<\delta/2$. For any $n\geq N$, pick $x_n$ to be a point where $g_n$ attains its maximum on the compact ball $\overline{B}(y,r)$. Because of the previous inequality, $x_n$ lies in the interior of the ball, so the derivative satisfies $dg_n(x_n)=0$.

Let $x$ be a limit point of $(x_n)$; such a point exists because the sequence stays in the compact ball $\overline{B}(y,r)$, and $x$ belongs to that ball. Along a subsequence converging to $x$ we have $g_n(x_n)\geq g_n(y)$ by the choice of $x_n$; since $g_n\to g$ uniformly and $g$ is continuous, passing to the limit gives $g(x)\geq g(y)$. Since $y$ is a strict maximum of $g$ on $\overline{B}(y,r)$, necessarily $x=y$. Thus every limit point of $(x_n)$ equals $y$, and as the sequence lives in a compact set this forces $\lim x_n=y$. This concludes the proof of the Lemma.
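To see the Lemma in action in a concrete one-dimensional case (an illustration, not part of the argument): take $g(x)=-x^2$, which has a strict maximum at $y=0$, and $g_n(x)=-x^2+\frac{1}{n}\sin x$, so that $\sup_{x\in\mathbb{R}}|g_n(x)-g(x)|\le \frac1n\to 0$. Each $g_n$ is strictly concave, since $g_n''(x)=-2-\frac{\sin x}{n}<0$, so it has a unique global maximizer $x_n$, characterized by $g_n'(x_n)=0$, i.e. $2x_n=\frac{\cos x_n}{n}$. Hence $|x_n|\le \frac{1}{2n}\to 0$ and $dg_n(x_n)=0$, exactly as the Lemma asserts.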

Now, to deal with the general case, fix a point $y$ and define $$g_n(x)=f_n(x)-f(x)-\|x-y\|^2.$$ This sequence of $C^1$ functions converges uniformly to $g(x)=-\|x-y\|^2$, which has a strict maximum at $y$. By the Lemma there is a sequence $(x_n)_n$ such that $\lim x_n=y$ and $dg_n(x_n)=0$ for $n$ large. Since $$dg_n(x_n).h=df_n(x_n).h-df(x_n).h-2\langle x_n-y, h \rangle,$$ the vanishing of $dg_n(x_n)$ gives $$df_n(x_n)=df(x_n)+ 2\langle x_n-y, \cdot \rangle.$$ The linear forms $h\mapsto 2\langle x_n-y, h \rangle$ converge to zero, since their operator norm is at most $2\|x_n-y\|$ by the Cauchy–Schwarz inequality, and $df(x_n)$ converges to $df(y)$ because $f$ is $C^1$. Hence $df_n(x_n)\to df(y)$, which concludes the proof.
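As a numerical sanity check of this construction (a minimal sketch in one dimension; the family $f_n(x)=x^2+\sin(nx)/n$, the point $y=0.5$ and the radius $r=0.3$ are illustrative choices, not part of the proof): here $f_n\to f$ uniformly with $\|f_n-f\|_\infty=1/n$, while $f_n'(x)=2x+\cos(nx)$ does not converge to $f'(x)=2x$ at a fixed $x$; yet maximizing $g_n$ on the ball produces points $x_n$ at which $f_n'(x_n)\to f'(y)$.

```python
import numpy as np

def f(x):             # limit function f(x) = x^2
    return x ** 2

def df(x):            # derivative of the limit
    return 2 * x

def f_n(x, n):        # f_n(x) = x^2 + sin(nx)/n, with sup|f_n - f| = 1/n
    return x ** 2 + np.sin(n * x) / n

def df_n(x, n):       # derivative of f_n; does not converge pointwise to df
    return 2 * x + np.cos(n * x)

y, r = 0.5, 0.3                            # illustrative point and ball radius
xs = np.linspace(y - r, y + r, 400001)     # dense grid on the closed ball

for n in (10, 100, 1000):
    # g_n(x) = f_n(x) - f(x) - |x - y|^2, the auxiliary function of the proof
    g_n = f_n(xs, n) - f(xs) - (xs - y) ** 2
    x_n = xs[np.argmax(g_n)]               # (approximate) maximizer on the ball
    print(f"n={n:4d}  x_n={x_n:.5f}  df_n(x_n)={df_n(x_n, n):+.5f}  "
          f"df_n(y)={df_n(y, n):+.5f}  df(y)={df(y):.5f}")
```

As $n$ grows, $x_n$ approaches $y$ and $df_n(x_n)$ approaches $df(y)=1$, even though $df_n(y)$ itself keeps oscillating.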