What inequalities should one know to evaluate limits fluently?

$-1\leq \sin(x)\leq 1$, and the same is true for $\cos$
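Combined with the squeeze theorem, these bounds already settle many oscillatory limits; a standard example:

$$ -\frac{1}{x}\leq \frac{\sin(x)}{x}\leq \frac{1}{x} \quad\text{for } x>0, \qquad\text{so}\qquad \lim_{x\to+\infty}\frac{\sin(x)}{x}=0. $$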

The AM-GM inequality is so popular it has its own tag on this site: for nonnegative reals $a_1,\dots,a_n$, $$\frac{a_1+\cdots+a_n}{n}\geq\sqrt[n]{a_1\cdots a_n}.$$
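One classical application to limits (a standard trick: apply AM-GM to the $n$ numbers $\sqrt{n},\sqrt{n},1,\dots,1$, for $n\geq 2$) shows that $\sqrt[n]{n}\to 1$:

$$ 1\leq \sqrt[n]{n}=\left(\sqrt{n}\cdot\sqrt{n}\cdot 1^{\,n-2}\right)^{1/n}\leq \frac{2\sqrt{n}+(n-2)}{n}\leq 1+\frac{2}{\sqrt{n}}\xrightarrow[n\to\infty]{}1. $$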

This chain of inequalities for the binomial coefficient is commonly used when binomial coefficients or exponentials pop up (valid for $1\leq k\leq n$): $$\frac{n^k}{k^k}\leq {n\choose k}\leq\frac{n^k}{k!}\leq \frac{(n\cdot e)^k}{k^k}$$
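All three steps are quick to verify; as a sketch:

$$ {n\choose k}=\prod_{i=0}^{k-1}\frac{n-i}{k-i}\geq\left(\frac{n}{k}\right)^{k}\quad\text{(each factor is }\geq n/k\text{)},\qquad {n\choose k}=\frac{n(n-1)\cdots(n-k+1)}{k!}\leq\frac{n^k}{k!}, $$

and the last inequality follows from $e^k=\sum_{j\geq 0}\frac{k^j}{j!}\geq\frac{k^k}{k!}$, i.e. $k!\geq (k/e)^k$.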


A good thing to have in mind is the hierarchy of common functions. In compact form:

For all $a∈ℝ$, $b⩾1$, $c>1$, $d>1$ and $x>1$

$$ a ≺ \log(\log(n)) ≺ \log(n)^b ≺ n^{\frac{1}{c}} ≺ n ≺ n \log(n) ≺ n^d ≺ x^n ≺ n!≺ n^n $$

as $n⟶+∞$, using Hardy's notations for asymptotic domination¹.

Note, though, that this does not tell you the specific $N$ beyond which one function exceeds the other.
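To see the hierarchy in action, here is a typical use, where the dominant term on each side decides the limit:

$$ \lim_{n\to\infty}\frac{n^{100}+2^n}{3^n+n^{5}}=\lim_{n\to\infty}\left(\frac{2}{3}\right)^{\!n}\cdot\frac{1+n^{100}/2^n}{1+n^{5}/3^n}=0, $$

since $n^d\prec x^n$ forces both correction factors to tend to $1$.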

You might also derive useful comparisons from the French Théorème de croissances comparées (roughly, the "theorem of compared growth rates"), for which I don't know of any equivalent in the English literature, but which is simply the following:

For all $a>0$ and $b>0$ $$ x^a = o_{+∞}(e^{bx}) $$ $$ \ln(x)^a = o_{+∞}(x^b) $$ and $$ \lvert\ln(x)\rvert^a = o_{0^+}\!\left(x^{-b}\right), $$ i.e. $x^b\,\lvert\ln(x)\rvert^a \to 0$ as $x\to 0^+$.
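For instance, the third relation (with $a=b=1$) gives the classic

$$ \lim_{x\to 0^{+}} x\ln(x)=0, $$

which can also be reduced to the first relation by substituting $x=e^{-t}$: then $x\,\lvert\ln(x)\rvert=t\,e^{-t}\to 0$ as $t\to+∞$.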


1. Which is messed up, because those are obsolete, but there are no other convenient notations for asymptotic domination as a (strict partial) order. What I was taught was that the Vinogradov notation $f ≪ g$ stood for $f=o(g)$, but apparently it is $f=O(g)$ instead. We really need a standardization of those things.


Two useful things to know when calculating limits are:

  • For all $a>1, \alpha > 0$, there exists some $N$ such that $\log_a n < n^\alpha$ for all $n>N$
  • For all $a>1, \alpha > 0$, there exists some $N$ such that $a^n > n^\alpha$ for all $n>N$

In other words, the exponential function grows faster than any power of $n$, and the logarithm grows slower than any power.
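For instance, with $a=2$ these two facts dispatch limits such as

$$ \lim_{n\to\infty}\frac{\log_2(n)}{\sqrt{n}}=0 \qquad\text{and}\qquad \lim_{n\to\infty}\frac{n^{100}}{2^{n}}=0: $$

take $\alpha=\tfrac14$ in the first fact, so $\log_2(n)/\sqrt{n}<n^{-1/4}$ for $n>N$; take $\alpha=101$ in the second, so $n^{100}/2^{n}<1/n$ for $n>N$.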