Differentiating an integral that grows like log asymptotically

The answer is no, even in the smooth case. Take for example:

$$ f(x) = \frac{2}{x} + \frac{\cos(\log(x))}{x} $$

Alter it on a small neighborhood of $0$ in such a way that there is no singularity there, preserving smoothness (the alteration is irrelevant for the asymptotics). This function is decreasing, since $f'(x)=-\frac{2+\cos(\log(x))+\sin(\log(x))}{x^2}<0$, and, for $t$ sufficiently large, we have

$$ \int_0^t f(x)\,dx = C + 2\log(t) + \sin(\log(t)) = 2\log(t) + o(\log(t)), $$

where $C$ is a constant depending on the alteration near $0$.
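As a numerical sanity check, here is a minimal sketch (plain Python; the names `f` and `F` are mine) of why this is a counterexample: $F(t)\sim 2\log t$, yet $t\,f(t)=2+\cos(\log t)$ oscillates in $[1,3]$ forever, so $f(t)$ is not asymptotic to $C/t$ for any constant $C$.

```python
import math

def f(x):
    # the counterexample, away from the (modified) neighborhood of 0
    return 2.0 / x + math.cos(math.log(x)) / x

def F(t):
    # exact antiderivative of f on [1, t]: 2*log(t) + sin(log(t))
    return 2.0 * math.log(t) + math.sin(math.log(t))

# F(t)/log(t) -> 2, but t*f(t) = 2 + cos(log(t)) keeps oscillating between 1 and 3:
for k in range(1, 6):
    t = math.exp(k * math.pi)  # here cos(log(t)) = (-1)^k
    print(f"t*f(t) = {t * f(t):.6f},  F(t)/log(t) = {F(t) / math.log(t):.6f}")
```

Along $t=e^{k\pi}$ the product $t\,f(t)$ alternates between $1$ and $3$, while $F(t)/\log t$ is exactly $2$.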

The monotone density theorem mentioned in the comments does not work in general if your r.h.s. is merely a slowly varying function (as is any function asymptotic to $\log(t)$). You want your r.h.s. to be a de Haan function. The specific result you may want to use is Theorem 3.6.8 here:

Bingham, N. H.; Goldie, C. M.; Teugels, J. L., *Regular Variation*, Encyclopedia of Mathematics and its Applications 27, Cambridge University Press, 1989. ZBL0667.26003.


The post by Raziel shows that the answer to the original question is no. The OP then asked, in a comment to that post, if one can still conclude that $f(t)\asymp\frac1t$ (as $t\to\infty$); as usual, $a\asymp b$ means here that $\limsup\left|\frac ab+\frac ba\right|<\infty$.

Let us show that the answer is still no. E.g., for $j=0,1,\dots$ let $t_j:=e^{j^2}$, \begin{equation} c_j:=\frac{\ln t_{j+1}-\ln t_j}{t_{j+1}-t_j}=\frac{2j+1}{t_{j+1}-t_j}\sim\frac{2j}{t_{j+1}} \tag{1} \end{equation} (as $j\to\infty$), and \begin{equation} f(x):=c_j\quad\text{for}\quad x\in[t_j,t_{j+1}), \end{equation} with $f:=c_0=\frac1{e-1}$ on $[0,t_0)$. Let also $F(t):=\int_0^t f(x)\,dx$.

Then $f$ is nonincreasing, $0<f\le1$, and $F(t_j)=c_0+\ln t_j\sim c_0+\ln t_{j+1}=F(t_{j+1})$, whence, by the monotonicity of $F$, $F(t)\sim\ln t$ (as $t\to\infty$). On the other hand, $f(t_{j+1}-)=c_j$ is much greater than $\frac1{t_{j+1}}$, by (1), and $f(t_j)=c_j$ is much less than $\frac1{t_j}$, again by (1).
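A quick numerical sketch of this construction (plain Python; the helper `c` is mine), confirming that $t_j\,c_j\to0$ while $t_{j+1}\,c_j\to\infty$:

```python
import math

# Step-function counterexample: t_j = e^{j^2} and f = c_j on [t_j, t_{j+1}),
# with c_j = (ln t_{j+1} - ln t_j)/(t_{j+1} - t_j) = (2j+1)/(t_{j+1} - t_j).
def c(j):
    return (2 * j + 1) / (math.exp((j + 1) ** 2) - math.exp(j * j))

# t_j * c_j -> 0 (so f(t_j) is much less than 1/t_j), while
# t_{j+1} * c_j -> infinity (so f(t_{j+1}-) is much greater than 1/t_{j+1}):
for j in (5, 10, 20):
    t_j, t_next = math.exp(j * j), math.exp((j + 1) ** 2)
    print(f"j={j}:  t_j*c_j = {t_j * c(j):.3e},  t_(j+1)*c_j = {t_next * c(j):.3f}")
```

Note $t_{j+1}\,c_j\approx2j+1$ and $t_j\,c_j\approx(2j+1)e^{-(2j+1)}$, matching (1). (Keeping $j\lesssim25$ avoids `float` overflow of $e^{(j+1)^2}$.)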

The only condition missing here is the continuity of $f$, as $f$ is not left-continuous at $t_{j+1}$ for $j=0,1,\dots$. This gap is quite easy, though tedious, to fix by approximation. For instance, one can replace the above $f$ on every interval $[t_{j+1}-c_02^{-j},t_{j+1}]$ by the linear interpolation of $f$ on the same interval. Then instead of the value $c_0+\ln t_{j+1}$ of $F(t_{j+1})$ we will have $b_j+\ln t_{j+1}\sim c_0+\ln t_j=F(t_j)$ for some $b_j\in[0,c_0]$, and instead of $f(t_{j+1}-)=c_j$ being much greater than $\frac1{t_{j+1}}$, we will have that $f(t_{j+1}-c_0)=c_j$ is much greater than $\frac1{t_{j+1}-c_0}$.
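To see that the interpolation costs only a bounded amount, note that smoothing on $[t_{j+1}-c_02^{-j},t_{j+1}]$ replaces the constant $c_j$ there by a linear ramp down to $c_{j+1}$, reducing the integral by $\frac{c_j-c_{j+1}}2\,c_02^{-j}$ at the $j$-th step. A numerical sketch (plain Python; helper names mine) checking that the total reduction stays in $[0,c_0]$, so that indeed $b_j\in[0,c_0]$:

```python
import math

c0 = 1.0 / (math.e - 1.0)  # value of f on [0, t_0)

def c(j):
    # c_j = (2j+1)/(t_{j+1} - t_j) with t_j = e^{j^2}
    return (2 * j + 1) / (math.exp((j + 1) ** 2) - math.exp(j * j))

# Each smoothing step reduces the integral by (c_j - c_{j+1})/2 * c0 * 2^{-j};
# since 0 < c_j <= c0 < 1, the total reduction is at most c0^2 < c0,
# so b_j = c0 - (cumulative reduction) stays in [0, c0].
total = sum((c(j) - c(j + 1)) / 2 * c0 * 2.0 ** (-j) for j in range(25))
print(f"total reduction = {total:.6f}  (c0 = {c0:.6f})")
```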