What exactly is calculus?

In a nutshell, Calculus (as seen in most basic undergraduate courses) is the study of how functions and sequences change and behave. The three main ideas are:

  • Limits: How sequences and functions behave when getting closer and closer to a desired point (geometrically, what happens when you "zoom in" near a point)
  • Derivatives: How functions change over a parameter (geometrically, the "slope of a graph at a given point")
  • Integrals: What's the cumulative effect of a function (geometrically, the "area under a graph")

And obviously (and maybe especially), how these relate to one another; the crowning jewel of Calculus is probably the Fundamental Theorem of Calculus, which truly lives up to its name and was developed by none other than Leibniz and Newton.
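
For reference, one standard statement of that theorem: if $f$ has a continuous derivative on $[a,b]$, then
$$\int_a^b f'(x)\,dx = f(b) - f(a),$$
that is, accumulating all the instantaneous changes of $f$ (an integral of a derivative) recovers the total change of $f$, which is precisely the bridge between the second and third bullet points above.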


In math, a "calculus" is a set of rules for computing/manipulating some set of objects. For instance the rules $\log AB = \log A+\log B$, etc, are the "logarithm calculus."

But commonly, "calculus" refers to "differential calculus" and "integral calculus." There is a set of rules (product rule, quotient rule, chain rule) for manipulating and computing derivatives. There is also a set of rules (integration by parts, trig substitution, etc.) for manipulating and computing integrals. So, at least etymologically, "calculus" refers to these two sets of rules.

But so far, this has been a bad answer. You really want to know what differential calculus and integral calculus have come to mean. So here goes:

There are a lot of formulas out there that involve multiplying two quantities. $D=RT$, area $= lw$, work = force × distance, etc. All of these formulas are equivalent to finding the area of a rectangle. If you drive 30 miles per hour for 4 hours, your distance is $D = RT = 30\cdot 4 = 120$ miles. Easy-peasy.

But what if the speed varies during the drive? Now your $30 \times 4$ rectangle is warped. The left and right sides and the bottom are still straight, but the top is all curvy. Still, the distance traveled is the area of the warped rectangle. Integral calculus cuts the area into infinitely many, infinitely tiny rectangles, computes the areas of all of them, and glues them back together to find the total area.
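
As a minimal numerical sketch of that cutting-and-gluing (the speed profile $v(t) = 30 + 10\sin t$ is made up purely for illustration, and the code is mine, not part of the answer):

```python
# Approximate the distance traveled when the speed keeps changing,
# by summing the areas of many thin rectangles (a Riemann sum).
import math

def speed(t):
    # hypothetical speed in mph at time t (in hours)
    return 30 + 10 * math.sin(t)

def distance(hours, n):
    dt = hours / n                              # width of each thin rectangle
    # each rectangle's height is the speed at its left edge
    return sum(speed(i * dt) * dt for i in range(n))

for n in (10, 100, 10_000):
    print(n, round(distance(4.0, n), 3))
# the sums close in on the exact value 30*4 + 10*(1 - cos 4) ≈ 136.536 miles
```

The integral $\int_0^4 v(t)\,dt$ is exactly the value these sums sneak up on as the rectangles get thinner; integral calculus is the toolkit for finding that limiting value without brute-force summing.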

How much work does it take to lift a 10-pound rock to the top of a 200-foot cliff? $10 \times 200$ foot-pounds. But now what if the rock is ice and it melts on the way up, so that when it reaches the top it weighs only 1 pound?
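
Suppose, purely for illustration, that the weight decreases linearly with height, so at height $h$ feet the rock weighs $10 - \frac{9h}{200}$ pounds. The warped "force × distance" rectangle then has area
$$W = \int_0^{200}\Bigl(10 - \frac{9h}{200}\Bigr)\,dh = 10\cdot 200 - \frac{9}{200}\cdot\frac{200^2}{2} = 2000 - 900 = 1100 \text{ foot-pounds},$$
somewhere between the $200$ foot-pounds for an all-1-pound rock and the $2000$ foot-pounds for an all-10-pound rock.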

Differential calculus is sort of the opposite problem. When a rock is falling, it starts at $0$ ft/sec and accelerates. At each point in time, it is going a different speed. Differential calculus gives us a formula for that constantly changing speed. If you graph the position of the rock against time, then the speed is the slope of that curve at each point in time.
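
Concretely (using the standard free-fall formula near the Earth's surface and ignoring air resistance, so this is an idealized example): the rock has fallen about $s(t) = 16t^2$ feet after $t$ seconds, and the slope of that graph at time $t$ is the limit of rise-over-run computed over ever shorter time intervals:
$$v(t) = \lim_{h\to 0}\frac{s(t+h)-s(t)}{h} = \lim_{h\to 0}\frac{16(t+h)^2-16t^2}{h} = \lim_{h\to 0}\,(32t + 16h) = 32t \ \text{ft/sec},$$
so the speed really does start at $0$ and change at every instant.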

In both integral and differential calculus, we do nothing more than take the formula for the area of a rectangle and the slope formula for a line and make them sit up and do tricks.


Another way of understanding calculus is that it is the science of refining approximations. The idea is that if we cannot calculate a value directly, we come up with a scheme that allows us to approximate the value as closely as we want. If that scheme is good enough that by sufficiently refining the approximation, we can eliminate all but one particular value as being the number we are after, then we have our answer.

There are a great many values that we cannot calculate directly, but which we can approximate. You mentioned one example: area. From our physical experience, we expect shapes to have a comparable quantity called area. If it takes the same amount of paint to cover two different shapes, then when I paint them again, being careful to keep my thicknesses the same, it will still take the same amount of paint the second time. To quantify this, we can define a square with side length $1$ to have area $1$. From this, together with the principles that area should not change under rigid motions and that if a shape is divided into two shapes, the sum of the areas of the parts should be the area of the whole, we can quickly calculate that the area of a rectangle has to be its width $w$ times its height $h$, provided that $w$ and $h$ are both rational values. With a little creativity, we can even show that it holds for some irrational values.
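
One way to carry out that quick calculation in the rational case: write $w = \frac pq$ and $h = \frac rs$ with $p, q, r, s$ natural numbers, and slice the rectangle into a grid of tiny squares of side length $\frac 1{qs}$; there are $ps$ columns and $rq$ rows of them. Since $(qs)^2$ such tiny squares assemble into the unit square, each has area $\frac 1{(qs)^2}$, and therefore
$$\text{area of the rectangle} = ps \cdot rq \cdot \frac 1{(qs)^2} = \frac{pr}{qs} = \frac pq \cdot \frac rs = wh.$$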

But by direct calculation, we can never show that the area of a rectangle is always its width times its height for all irrational values. And even worse, we cannot arrive at an area for any figure whose boundary is not made of line segments strung together. So we have to find a means to calculate them indirectly. That means is by refining approximations.

While Newton and Leibniz truly do deserve their titles as the fathers of calculus for their (independent) discoveries of the Fundamental Theorem of Calculus, the basic ideas pre-date them by nearly 2000 years. The key idea is attributed to Eudoxus, though it may predate him as well. That idea is this: if you are comparing two values $x$ and $y$, and can show that $x$ cannot be less than $y$ and $x$ cannot be greater than $y$, then it has to be that $x = y$. Pretty obvious. Let's apply it to finding areas:

Suppose you have some arbitrary shape $S$. We can't directly calculate the area of $S$, but we can cover it with a grid of squares of side length $\frac 1n$ for some natural number $n$.

(Figure: area by counting squares. Tan squares lie entirely inside $S$; blue squares merely overlap it.)

Now count the number $M_n$ of squares that overlap $S$ (both the blue and the tan squares), and the number $m_n$ of squares that are completely inside $S$ (the tan squares only). Since every square of the latter type is also of the former, it is always the case that $m_n \le M_n$. The total area of the covering squares is the sum of the areas of the individual squares, each of which we already know to be $\frac 1{n^2}$, so it will be $\frac {M_n}{n^2}$, and similarly for the contained squares. If $S$ has an area, then it should be true that $$\frac {m_n}{n^2} \le \text{ area of }S \le \frac {M_n}{n^2}$$ for every $n$.

Now for most shapes $S$ we cannot exactly calculate the area this way, not unless $S$ happens to be the union of a bunch of the squares. But for nice shapes, we can come up with approximations that are as good as we want. That is, if we say that we want to know the area to a tolerance of $\epsilon$ (epsilon is a traditional variable for this role) for any given $\epsilon > 0$, then by dint of effort we can produce an $n$ big enough that $$0 \le \frac {M_n}{n^2} - \frac {m_n}{n^2} < \epsilon$$ Since the actual area lies between the two, either value differs from the area by an amount less than $\epsilon$.
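
Here is a small sketch of that counting procedure for one concrete choice of $S$, the unit disk, whose area should come out to $\pi$. The shape and the code are my own illustration, not part of the argument above; for this particular shape and grid, checking the four corners of each square happens to decide both counts exactly, while a general $S$ would need a more careful overlap test.

```python
# Bracket the area of the unit disk x^2 + y^2 <= 1 between m_n/n^2 and M_n/n^2,
# counting grid squares of side 1/n as in the figure above.
def inside(x, y):
    return x * x + y * y <= 1.0

def bounds(n):
    m_n = 0          # squares completely inside the disk (the tan squares)
    M_n = 0          # squares that overlap the disk at all (tan plus blue)
    # the disk sits inside [-1, 1] x [-1, 1], a 2n-by-2n block of grid squares
    for i in range(-n, n):
        for j in range(-n, n):
            corners = [inside((i + a) / n, (j + b) / n)
                       for a in (0, 1) for b in (0, 1)]
            if all(corners):
                m_n += 1
            if any(corners):
                M_n += 1
    return m_n / n ** 2, M_n / n ** 2

for n in (10, 50, 200):
    lo, hi = bounds(n)
    print(n, lo, hi)          # both bounds squeeze toward the true area, pi
```

The gap $\frac{M_n}{n^2} - \frac{m_n}{n^2}$ consists exactly of the squares straddling the boundary, and it shrinks as $n$ grows, which is what lets us meet any tolerance $\epsilon$.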

Now suppose there is a number $A$ that we think should be the area, so that $A$ also lies between $\frac {m_n}{n^2}$ and $\frac {M_n}{n^2}$ for every $n$. Let $x < A < y$ for some values $x, y$, and let $\epsilon$ be the smaller of $A - x$ and $y - A$. If we can always find $n$ as above, then $$x = A - (A - x) \le A - \epsilon < \frac {m_n}{n^2} \le \text{area of }S,$$ where the middle inequality holds because $A \le \frac {M_n}{n^2} < \frac {m_n}{n^2} + \epsilon$. Hence the area cannot be $x$. And similarly, it cannot be $y$. So, per Eudoxus, the area has to be $A$. (If we cannot find such an $n$, then we were wrong about $A$ being the area.)

"Limits" are just a terminology used to describe this concept of refining approximations. Derivatives (slopes of tangent lines to curves) and integrals (areas of regions defined by curves) are two very common and very useful values that usually cannot be calculated directly, and which turn out to be closely related.