Is it necessarily true that if $\lim_{x \to \infty } \bigl(f(x) + f'(x)\bigr) = L$, where $L$ is finite, then $\lim_{x\to\infty}f(x)= L$ and $\lim_{x\to\infty}f'(x)=0$?

Consider $r(x) := f'(x) + f(x)$ where we will view this as being a fixed function, and $f'(x) + f(x) = r(x)$ as giving a differential equation. The solution to this ODE is $f(x) = e^{-x} \int e^x r(x) \, dx$.
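(For completeness, this closed form comes from multiplying the equation by the integrating factor $e^x$: $$ \frac{d}{dx}\bigl(e^x f(x)\bigr) = e^x f'(x) + e^x f(x) = e^x r(x), $$ so integrating both sides and dividing by $e^x$ gives $f(x) = e^{-x} \int e^x r(x)\,dx$.)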

Now, given that $r(x) \to L$ as $x \to \infty$, we have that $\int e^x r(x)\,dx = e^x L + o(e^x)$, so $f(x) = L + o(1)$. Therefore, $\lim_{x\to \infty} f(x) = L$, and $\lim_{x\to \infty} f'(x) = \lim_{x\to \infty} (r(x) - f(x)) = L - L = 0$.

(To prove the claim that $r(x) \to L$ as $x \to \infty$ implies $\int e^x r(x)\,dx = e^x L + o(e^x)$: first, using $r(x) - L$ in place of $r(x)$ reduces to the case $L = 0$. Now, given $\epsilon > 0$, choose $R$ such that $|r(x)| < \epsilon$ whenever $x > R$; then $\int e^x r(x) \,dx = C + \int_R^x e^t r(t)\,dt$ for some constant $C$, and $$\left|\int_R^x e^t r(t)\,dt\right| \le \int_R^x e^t |r(t)|\,dt \le \int_R^x e^t \cdot \epsilon \, dt = \epsilon(e^x - e^R).$$ Hence $\left|\int e^x r(x)\,dx\right| < 2\epsilon e^x$ for $x$ sufficiently large; and since this holds for every $\epsilon > 0$, we conclude the desired result that $\int e^x r(x)\,dx = o(e^x)$.)
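As a quick numerical illustration (not part of the proof), one can integrate the equation $f' = r - f$ with a simple forward-Euler loop for a sample forcing term $r(x) \to L$ of my own choosing, and watch $f(x)$ approach $L$ regardless of the initial value:

```python
# Numerical sanity check (illustration only): solve f'(x) = r(x) - f(x)
# by forward Euler for a sample r with r(x) -> L, and observe f(x) -> L.
import math

L = 2.0

def r(x):
    # a sample forcing term tending to L (hypothetical choice)
    return L + math.sin(x) / (1.0 + x)

f, x, h = 10.0, 0.0, 1e-3   # arbitrary initial value f(0) = 10
while x < 50.0:
    f += h * (r(x) - f)      # Euler step for f' = r - f
    x += h

print(abs(f - L))            # small: f(50) has settled near L = 2
```

The exponential decay of the homogeneous solution $e^{-x}$ wipes out the initial value quickly, which is exactly what the $o(e^x)$ estimate above captures.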


First, replacing $f(x)$ by $f(x)-L$, we can reduce to the case that $L=0$. So we're given $f(x)+f'(x)\to0$ as $x\to\infty$, and the objective is to prove that $f(x)\to0$ as $x\to\infty$. It suffices to show that, given any $\varepsilon>0$, we have $f(x)<\varepsilon$ for all sufficiently large $x$. Indeed, if we can do this, then the same argument applied to $-f$ gives $f(x)>-\varepsilon$ and therefore $|f(x)|<\varepsilon$ for all sufficiently large $x$.

So let an arbitrary $\varepsilon>0$ be given. By assumption, we can fix some number $A$ such that $f(x)+f'(x)<\varepsilon/3$ for all $x>A$. Of course, if $f(x)<\varepsilon$ for all $x>A$, then we have what we need.

So suppose that we have some $B>A$ with $f(B)\geq\varepsilon$. Since we have, by our choice of $A$, $f(B)+f'(B)<\varepsilon/3$, it follows that $f'(B)<-2\varepsilon/3$. Similarly, if we consider any $C>B$ such that $f(x)\geq2\varepsilon/3$ throughout the interval $[B,C]$, then throughout this interval we also have $f'(x)<-\varepsilon/3$. By the mean value theorem, there is a point $y\in(B,C)$ such that $$ \frac{f(C)-f(B)}{C-B}=f'(y)<\frac{-\varepsilon}3. $$ Since $f(C)\geq2\varepsilon/3$, this gives $\frac\varepsilon3(C-B)<f(B)-f(C)\leq f(B)-\frac{2\varepsilon}3$, which rearranges to $$ C<B+\frac3\varepsilon f(B)-2. $$ The exact bound is not important; the point is that $C$ is bounded above, so $f$ cannot remain $\geq2\varepsilon/3$ from $B$ all the way out to $\infty$.

For the rest of the proof, fix $C$ as the first point after $B$ where $f$ takes a value $\leq2\varepsilon/3$. (There is a first such point because the set of such points is nonempty, by the previous paragraph, and closed, because $f$, being differentiable, is continuous.) Continuity of $f$ also gives us that $f(C)=2\varepsilon/3$. To complete the proof, it suffices to show that $f(x)<\varepsilon$ for all $x>C$. So suppose not, and let $D$ be the first counterexample. Again, continuity of $f$ implies that there is a first such $D$ and that $f(D)=\varepsilon$.

For all $x\in[C,D)$, we have, by our choice of $D$, that $f(x)<\varepsilon=f(D)$. Therefore, for $x\in[C,D)$, $$ \frac{f(D)-f(x)}{D-x}>0, $$ and, letting $x\to D^-$, differentiability of $f$ at $D$ gives $f'(D)\geq0$. But then $f(D)+f'(D)\geq f(D)=\varepsilon$, contrary to the fact that, since $D>A$, our choice of $A$ ensures $f(D)+f'(D)<\varepsilon/3$.

Therefore, no such $D$ can exist, and the proof is complete.