I am learning functional equations and encountered the following problem: find all continuous functions $f:\mathbb{R}\to\mathbb{R}$ satisfying $f(f(x)-x)=x$.

Write $g(x)=f(x)-x$, so that the functional equation becomes $f(g(x))=x$, or equivalently $g(g(x))+g(x)=x$. In particular $g$ is injective (it has $f$ as a left inverse), and of course it is continuous.

First, I claim that $g(0)=0$. To prove this, note that since $g$ is injective and continuous, it is either increasing or decreasing. If it is increasing, then $g(0)$ and $g(g(0))$ have the same sign, so the only way they can add to $0$ is if $g(0)=0$. If $g$ is decreasing, then $g(0)$ and $g(g(0))=-g(0)$ have opposite signs, so $g(x)-x$ changes sign between $0$ and $g(0)$, and by the intermediate value theorem there must be an $x$ between $0$ and $g(0)$ such that $g(x)=x$. But then the functional equation says $2x=x$, so $x=0$ and thus $g(0)=0$.

Now consider the function $h(x)=g(x)/x$ on $\mathbb{R}\setminus\{0\}$. Since $g(g(x))+g(x)=x$, we have $$h(g(x))h(x)+h(x)=\frac{g(g(x))}{g(x)}\cdot\frac{g(x)}{x}+\frac{g(x)}{x}=1$$ and thus $$h(g(x))=T(h(x))$$ where $$T(x)=\frac{1-x}{x}.$$ Note moreover that since $g$ is monotone with $g(0)=0$, $h$ must always have the same sign.
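
As a quick sanity check (not part of the proof), one can verify with sympy that $T$ and $T^2$ have exactly the same fixed points, namely the two roots of $x^2+x-1=0$; these are the numbers that appear below as $\alpha$ and $\beta$.

```python
import sympy as sp

x = sp.symbols('x')

def T(t):
    return (1 - t) / t        # the map T from above

# fixed points of T: (1 - x)/x = x  <=>  x^2 + x - 1 = 0
print(sp.solve(sp.Eq(T(x), x), x))                     # (-1 + sqrt(5))/2 and (-1 - sqrt(5))/2
# T^2 has no further fixed points: the same quadratic appears again
print(sp.solve(sp.Eq(sp.simplify(T(T(x))), x), x))
```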

Now suppose $g$ is increasing, so $h(x)>0$ everywhere. Then $T(h(x))=h(g(x))>0$ as well, and so $h(x)<1$. But then we must have $T(h(x))<1$, and so $h(x)>1/2$. Similarly we find that $h(x)<T^{-n}(1)$ for each even $n$ and $h(x)>T^{-n}(1)$ for each odd $n$. But the sequence $(T^{-n}(1))$ over even $n$ is decreasing and bounded below, hence converges to a fixed point of $T^2$; over odd $n$ it is increasing and bounded above, so it again converges to a fixed point of $T^2$. The unique positive fixed point of $T^2$ is $\alpha=\frac{\sqrt{5}-1}{2}$, so we must have $h(x)=\alpha$ for all $x\neq 0$ and $g(x)=\alpha x$ for all $x$, i.e. $f(x)=(1+\alpha)x=\frac{1+\sqrt{5}}{2}\,x$.
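
For concreteness (this is only an illustration, not part of the proof), here is what the bounds $T^{-n}(1)$ look like numerically; $T^{-1}(y)=\frac{1}{1+y}$, and the even- and odd-indexed values squeeze in on $\alpha\approx 0.618$ from above and below.

```python
alpha = (5 ** 0.5 - 1) / 2       # the positive fixed point, about 0.618034
y = 1.0                          # T^0(1)
for n in range(10):
    side = "upper" if n % 2 == 0 else "lower"
    print(f"T^-{n}(1) = {y:.6f}   ({side} bound for h, alpha = {alpha:.6f})")
    y = 1 / (1 + y)              # apply T^{-1}
```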

(In more intuitive terms, what's going on here is that $\alpha$ is a repelling fixed point of $T$, and so if $x$ is not exactly equal to $\alpha$ then $T^n(x)$ will eventually get far away from $\alpha$. In particular, it turns out that $T^n(x)$ will always get far enough away to become negative, contradicting that $h$ needs to be positive.)
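
To see this concretely, here is a tiny experiment (only an illustration; the starting value $0.62$ is an arbitrary choice slightly above $\alpha$): the iterates of $T$ oscillate around $\alpha$ with growing amplitude and turn negative after a few steps.

```python
def T(x):
    return (1 - x) / x

orbit = [0.62]                   # arbitrary starting value slightly above alpha
while orbit[-1] > 0:             # stop as soon as an iterate turns negative
    orbit.append(T(orbit[-1]))
print([round(v, 4) for v in orbit])
# [0.62, 0.6129, 0.6316, 0.5833, 0.7143, 0.4, 1.5, -0.3333]
```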

The case that $g$ is decreasing is a little more complicated, since the negative fixed point $\beta=\frac{-\sqrt{5}-1}{2}$ of $T$ is attracting instead of repelling. The trick is that $\beta$ is a repelling fixed point of $T^{-1}$, so we can use a similar argument with $T^{-1}$ as long as we first show that $g$ is surjective. To see that $g$ is surjective, note that since $h$ is always negative and $T(y)=\frac{1}{y}-1<-1$ for $y<0$, we have $h(g(x))=T(h(x))<-1$ for all $x\neq 0$, and so $|g(g(x))|>|g(x)|$ for all $x\neq 0$. Now the image of $g$ is some possibly unbounded interval $I$. Since $|g^3(x)|>|g^2(x)|>|g(x)|$ for all $x\neq 0$ and $g^3(x)$ has the same sign as $g(x)$, the image of $g^3$ must also be $I$ (since as $g(x)$ approaches the upper and lower bounds of $I$, so does $g^3(x)$). Since $g$ is injective, this means that $g^2$ is surjective (otherwise $g^3=g\circ g^2$ would have a smaller image than $g$), and so $g$ is surjective.

Now we have $$h(x)=T(h(g^{-1}(x)))$$ for all nonzero $x$. Since $h(g^{-1}(x))$ is always negative, this implies $h(x)<-1$. But then $h(g^{-1}(x))<-1$ and so $h(x)>-2$. Similarly we find $h(x)<T^n(-1)$ for all even $n$ and $h(x)>T^n(-1)$ for all odd $n$. As in the previous case, these bounds converge to $\beta$, the unique negative fixed point of $T^2$, and so $h(x)=\beta$ for all $x\neq 0$ and $g(x)=\beta x$ for all $x$, i.e. $f(x)=(1+\beta)x=\frac{1-\sqrt{5}}{2}\,x$.
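
Again purely as an illustration, here is what the bounds $T^{n}(-1)$ look like numerically; consistent with $\beta$ being an attracting fixed point of $T$, they close in on $\beta\approx-1.618$ from both sides.

```python
beta = -(1 + 5 ** 0.5) / 2       # the negative fixed point, about -1.618034
x = -1.0                         # T^0(-1)
for n in range(10):
    side = "upper" if n % 2 == 0 else "lower"
    print(f"T^{n}(-1) = {x:.6f}   ({side} bound for h, beta = {beta:.6f})")
    x = (1 - x) / x              # apply T
```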


A few days ago I deleted my answer, because it depended on the injectivity of $f$, which, as Eric Wofsey remarked, is not obvious at all. An answer has since been posted (by him), but I'll post my variation of the previous answer anyway.

We want to prove that the two functions $f(x) = \lambda x$, where $\lambda$ is a root of $\lambda^{2}-\lambda-1=0$, are the only solutions (a quick numerical check that these really are solutions follows after the outline below). The idea is as follows:

  • Consider the graph $\Gamma = \{(x,f(x))\mid x\in\mathbb R\}$ of $f$ in the plane. The functional equation translates into the existence of a linear map $T$ that preserves $\Gamma$.

  • Unless $f(x) = \lambda x$ with $\lambda$ as above, we can use $T$ to construct points $P_1 = (x_1, f(x_1))$ and $P_2 = (x_2, f(x_2))$ with $x_1, x_2$ of the same sign and $P_1, P_2$ on different sides of the line $x = y$.

  • It follows that there is a point with $f(x) = x$ and $x \ne 0$. However, from the functional equation we see that $0$ is the only possible fixed point.
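
Before the details, here is a quick numerical check (not part of the argument; the sample points are arbitrary) that the two linear candidates really satisfy $f(f(x)-x)=x$.

```python
import math

for lam in ((1 + math.sqrt(5)) / 2, (1 - math.sqrt(5)) / 2):   # roots of t^2 - t - 1 = 0
    def f(x):
        return lam * x
    ok = all(abs(f(f(x) - x) - x) < 1e-9 for x in (-2.0, -0.5, 0.0, 1.0, 3.7))
    print(f"lambda = {lam:+.6f}:  f(f(x)-x) = x at the sample points: {ok}")
```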

Now for the details:

Let $(x,y)\in\mathbb R^2$ be a point on the graph $\Gamma$ of $f$ (so $f(x) = y$). Then $(y - x, x)$ is also on the graph of $f$, since $f(y-x)=f(f(x)-x)=x$. That means that the linear transformation with matrix

$$T = \begin{pmatrix}-1 & \phantom{-}1\\ \phantom{-}1 & \phantom{-}0\end{pmatrix}$$

maps the graph of $f$ into itself. It is not hard to see that $T:\Gamma\to\Gamma$ is in fact surjective, so $T^{-1}$ also preserves $\Gamma$.
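
A tiny numerical illustration of this graph-preservation (using the linear solution $f(x)=\frac{1+\sqrt 5}{2}\,x$ purely as an example, with arbitrary sample points): applying $T$ to a point $(x,f(x))$ lands on the graph again.

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2                   # (1 + sqrt 5)/2

def f(x):
    return phi * x                           # one known solution, used only as an example

T = np.array([[-1.0, 1.0],
              [ 1.0, 0.0]])

for x in (-3.0, 0.5, 2.0):
    u, v = T @ np.array([x, f(x)])           # image of the graph point (x, f(x))
    print(np.isclose(v, f(u)))               # True: the image is again a graph point
```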

$T$ is diagonalized by a reflection, and the eigenvalues are $-\phi$ and $\phi^{-1}$, where $\phi=\frac{1+\sqrt 5}{2}$ is the golden ratio.
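
These facts are easy to confirm numerically (a sanity check only, not part of the argument): $T$ is symmetric, its eigenvalues are $-\phi\approx-1.618$ and $\phi^{-1}\approx0.618$, and the diagonalizing basis is orthonormal.

```python
import numpy as np

T = np.array([[-1.0, 1.0],
              [ 1.0, 0.0]])
phi = (1 + np.sqrt(5)) / 2

w, V = np.linalg.eigh(T)                     # T is symmetric, so eigh applies
print(w)                                     # approx [-1.618034, 0.618034]
print(np.allclose(w, [-phi, 1 / phi]))       # eigenvalues are -phi and 1/phi
print(np.allclose(V.T @ V, np.eye(2)))       # the eigenbasis is orthonormal
```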

Let $\Gamma'$ be the graph in the new coordinates, which is a reflection of the original graph. Note that it doesn't have to be the graph of a function anymore. We want to prove that $\Gamma'$ has to coincide with one of the coordinate axes.

  • Assume that there exists a point $P\in\Gamma'$ outside the (new) axes. Then $T^{2n}P \to (\pm\infty,0)$ and $T^{2n + 1}P \to (\mp\infty,0)$ as $n\to\infty$, since $\phi > 1$ and $\phi^{-1} < 1$.

  • This shows that for every $x$ there is a point $(x,y)\in\Gamma'$ for at least one $y\in\mathbb R$ (using that $\Gamma'$ is connected). Let $y$ be such that $(0,y)\in\Gamma'$; then also $(0,\phi^{-n}y)\in\Gamma'$ for every $n$, so $(0,0)\in\Gamma'$ by continuity ($\Gamma'$, like the graph $\Gamma$ of the continuous function $f$, is a closed set).

  • Also $T^{-n}P\to (0,\pm\infty)$, where the sign is equal to the sign of the $x$ coordinate of $P$. (A small numerical sketch of these orbits follows below.)
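
Here is the promised numerical sketch of these orbits (the point $P$ below is an arbitrary choice off the eigenlines, and the coordinates are printed in the eigenbasis): under $T$ the component along the $-\phi$ direction blows up while the other one dies out, and under $T^{-1}$ the roles are reversed.

```python
import numpy as np

T = np.array([[-1.0, 1.0],
              [ 1.0, 0.0]])
w, V = np.linalg.eigh(T)              # columns of V: eigendirections for -phi and 1/phi
P = np.array([1.0, 0.3])              # arbitrary point off both eigenlines

for n in range(6):
    fwd = np.linalg.matrix_power(T, n) @ P
    bwd = np.linalg.matrix_power(np.linalg.inv(T), n) @ P
    # components of T^n P and T^{-n} P along the two eigendirections
    print(n, np.round(V.T @ fwd, 3), np.round(V.T @ bwd, 3))
```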

We are now in the situation of the following picture, where the red dots lie on the graph and the blue lines are the eigenspaces of $T$. The dots other than the origin may be very far away, but they get arbitrarily close to the blue lines.

[Figure: points on the graph, with the eigenlines of $T$ in blue]

Note that $T^{-n}P$ could also be between the negative $y$-axis and the line $y = x$, depending on the location of $P$. In any case it is clear that the graph of $f$ must cross the line $y = x$ at some $x \ne 0$, i.e. $f(\xi) = \xi$ for some $\xi\ne 0$. But

$$\xi = f(f(\xi) - \xi) = f(0) = 0$$

where the last equality uses $f(0)=0$, which was established above (the origin lies on the graph). This shows that such a point $P$ couldn't exist.