How do you differentiate the likelihood function for the uniform distribution in finding the M.L.E.?

For a sample $x_1,\ldots,x_n$ from the uniform distribution on $(0,\theta)$, the likelihood function can be written as $$ L(\theta)=\frac{1}{\theta^n}\mathbf{1}_{\theta\geq c}, $$ where $c=\max\{x_1,\ldots,x_n\}$. Therefore, $\theta\mapsto L(\theta)$ is not differentiable on the whole of $(0,\infty)$, and hence we cannot simply solve $L'(\theta)=0$ to look for maxima and minima. (Maxima and minima of a function $f$ have to be found among the points $x$ where either $f'(x)=0$ or $f'(x)$ is undefined.)

Note, however, that $L$ is differentiable on $(0,\infty)\setminus\{c\}$: we have $L(\theta)=0$ for $\theta\in (0,c)$, and on $(c,\infty)$ the derivative $L'(\theta)=-n\theta^{-(n+1)}<0$ shows that $L$ is strictly decreasing there. Since $$ L(c)=\frac{1}{c^n}>\frac{1}{\theta^n}=L(\theta),\quad \text{for all }\;\theta>c, $$ we see that $L$ attains its global maximum at $\theta=c$, i.e. the M.L.E. is $\hat\theta=\max\{x_1,\ldots,x_n\}$.
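A minimal numerical sketch of this argument (the sample size, seed, and true parameter below are illustrative choices, not part of the question): evaluating $L$ on a grid shows it is zero below the sample maximum $c$, jumps to $1/c^n$ at $c$, and decreases afterwards, so the grid argmax lands at (or just above) $c$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0                       # illustrative true parameter
x = rng.uniform(0.0, theta_true, size=20)
c = x.max()                            # sample maximum

def likelihood(theta, x):
    """L(theta) = theta^(-n) for theta >= max(x), and 0 otherwise."""
    n = len(x)
    return 0.0 if theta < x.max() else theta ** (-n)

# Evaluate L on a grid straddling c.
grid = np.linspace(0.5 * c, 2.0 * c, 1001)
values = np.array([likelihood(t, x) for t in grid])
theta_hat = grid[values.argmax()]

print(theta_hat, c)  # argmax sits at (or just above) c, up to grid spacing
```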


In addition to Stefan Hansen's great answer (+1), intuitively just think of the following:

  1. As you say, $L(\theta)$ is a decreasing function (on the set where it is positive), so to maximize it, $\theta$ has to be as small as possible

  2. Secondly, given the restriction imposed by the observations (the random variables $X_i$), can $\theta$ be smaller than $\max_i X_i$? The answer is no, since the likelihood is $0$ there

1 and 2 together imply that, even though shrinking $\theta$ below $\max_i X_i$ would formally yield a larger value of $1/\theta^n$, such a value of $\theta$ violates the second restriction and is therefore not admissible.
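This tension can be checked numerically (the sample below is an illustrative draw, not from the question): for a $\theta$ below the sample maximum, $1/\theta^n$ alone would indeed be larger, but the actual likelihood at that $\theta$ is $0$.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=10)   # illustrative sample, true theta = 1
n, c = len(x), x.max()

# Ignoring the support restriction, theta^(-n) keeps growing as theta shrinks...
theta_small = 0.5 * c
unrestricted = theta_small ** (-n)

# ...but such a theta is inconsistent with the data: some x_i exceed it,
# so the true likelihood there is 0.
restricted = theta_small ** (-n) if theta_small >= c else 0.0
print(unrestricted > c ** (-n), restricted)  # True 0.0
```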