There must be a good introductory numerical analysis course out there!

I believe our Numerics course was very interesting. Basically we had the reverse order of the structure in your example.

Numerics (1-year course)

  1. Motivational example: heat transfer between two points. We discretized the problem and derived a way to solve it (1-dimensional FDM). From this, we moved on to multiple dimensions and time dependence (FTCS, etc.), introducing error estimates along the way. (A minimal FTCS sketch appears after this list.)

  2. Since each such problem ultimately boils down to a system of linear equations, we looked at several of the iterative algorithms (gradient descent, CG, multigrid, ...) and, of course, at error estimates and matrix conditioning. (A bare-bones CG sketch appears after this list.)

  3. We then went on to interpolation methods, splines and the like, to replace our linear ansatz from before. (A short interpolation comparison appears after this list.)

  4. Next, we looked at quadrature. Even without explicit motivation, it was clear to us that this was useful. (A small quadrature convergence check appears after this list.)

  5. At this point, we were able to take short detours and briefly look at other fields of numerics, for example the finite volume method. We also took a look at things we had left out, like Newton's method (a lot of which had already been introduced in other lectures). (A tiny Newton sketch appears after this list.)

  6. We finished the course with the finite element method (as this is a core research field at our university), starting with Ritz-Galerkin and ending with a posteriori error estimates (although this part needs some basic knowledge of functional analysis). A toy Ritz-Galerkin computation appears after this list.
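
To make item 1 concrete, here is a minimal sketch of a 1-D FTCS (forward-time, centered-space) scheme of the kind the course started from; the domain, boundary temperatures, and grid sizes are my own illustrative choices, not the lecture's:

    import numpy as np

    # FTCS for the heat equation u_t = alpha * u_xx on [0, 1] with fixed
    # end temperatures (Dirichlet data). All parameters are illustrative.
    alpha = 1.0
    nx, nt = 51, 2000
    dx = 1.0 / (nx - 1)
    dt = 0.4 * dx**2 / alpha      # respects the stability limit dt <= dx^2 / (2*alpha)

    u = np.zeros(nx)
    u[0], u[-1] = 100.0, 0.0      # boundary temperatures

    for _ in range(nt):
        # centered second difference in space, forward Euler step in time
        u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

    print(u[::10])                # profile relaxing toward the linear steady state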
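
For item 2, a bare-bones conjugate gradient iteration; the test matrix (a 1-D finite difference Laplacian, which is symmetric positive definite) and the tolerance are made up for illustration:

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
        """Solve A x = b for a symmetric positive definite matrix A."""
        x = np.zeros_like(b)
        r = b - A @ x              # residual
        p = r.copy()               # search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x

    n = 50
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    x = conjugate_gradient(A, b)
    print(np.linalg.norm(A @ x - b))   # residual norm, ~1e-10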
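
For item 3, a short comparison of the piecewise-linear ansatz with a cubic spline, using SciPy; the sample function and node count are my own choices:

    import numpy as np
    from scipy.interpolate import CubicSpline

    x = np.linspace(0, 2 * np.pi, 9)       # coarse interpolation nodes
    y = np.sin(x)
    xx = np.linspace(0, 2 * np.pi, 200)    # fine evaluation grid

    linear = np.interp(xx, x, y)           # piecewise-linear interpolant
    spline = CubicSpline(x, y)(xx)         # C^2 cubic spline interpolant

    print("max linear error:", np.abs(linear - np.sin(xx)).max())
    print("max spline error:", np.abs(spline - np.sin(xx)).max())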
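
For item 4, composite trapezoidal and Simpson rules applied to an integral with a known value, so the O(h^2) and O(h^4) error behaviour is visible; the test integrand is my own choice:

    import numpy as np

    def composite_trapezoid(f, a, b, n):
        """Composite trapezoidal rule on n subintervals (error O(h^2))."""
        x = np.linspace(a, b, n + 1)
        h = (b - a) / n
        return h * (0.5 * f(x[0]) + f(x[1:-1]).sum() + 0.5 * f(x[-1]))

    def composite_simpson(f, a, b, n):
        """Composite Simpson rule on n subintervals, n even (error O(h^4))."""
        x = np.linspace(a, b, n + 1)
        h = (b - a) / n
        return h / 3 * (f(x[0]) + 4 * f(x[1:-1:2]).sum()
                        + 2 * f(x[2:-1:2]).sum() + f(x[-1]))

    exact = 2.0   # integral of sin(x) over [0, pi]
    for n in (8, 16, 32, 64):
        print(n,
              abs(composite_trapezoid(np.sin, 0, np.pi, n) - exact),
              abs(composite_simpson(np.sin, 0, np.pi, n) - exact))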
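
For the Newton's method mentioned in item 5, a tiny scalar sketch (the example equation is mine):

    def newton(f, fprime, x0, tol=1e-12, max_iter=50):
        """Newton's method for a scalar equation f(x) = 0."""
        x = x0
        for _ in range(max_iter):
            step = f(x) / fprime(x)
            x -= step
            if abs(step) < tol:
                break
        return x

    # sqrt(2) as the positive root of x^2 - 2; convergence is quadratic
    print(newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0))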
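
And for item 6, a toy Ritz-Galerkin computation: piecewise-linear (hat) elements for -u'' = f on (0, 1) with homogeneous Dirichlet data. The right-hand side is chosen so the exact solution is sin(pi x); the mesh size and the crude quadrature for the load vector are my own simplifications:

    import numpy as np

    n = 32                         # number of elements
    h = 1.0 / n
    x = np.linspace(0, 1, n + 1)   # mesh nodes

    # Stiffness matrix of the interior hat functions: (1/h) * tridiag(-1, 2, -1).
    A = (np.diag(2 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h

    # Load vector, lumping the integral of f against each hat function as h * f(x_i).
    f = lambda t: np.pi**2 * np.sin(np.pi * t)
    b = h * f(x[1:-1])

    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, b)

    print("max nodal error:", np.abs(u - np.sin(np.pi * x)).max())   # O(h^2)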

I'm the kind of student who will follow a lecture with a lot of interest if there is a strong, reasonable motivation behind it, or at least some sort of "big picture".

Perhaps to your liking, we had a heavy emphasis on error estimates. We had a lot of real-life, hands-on examples in between, highlighting how important this is (http://www.ima.umn.edu/~arnold/disasters/sleipner.html).

Further, I have to point out (as briefly mentioned in point 5) that a lot of things had already been introduced in other lectures. Mainly, our physics lectures required some basic numerics, so this was not a complete introduction to the subject.


John Hubbard tends to take sort of the opposite track, in that he likes to bring a more serious numerical analysis perspective into the 1st and 2nd courses on calculus and differential equations, rather than assuming the students come out of a standard service-stream sequence of calculus, differential equations, and linear algebra courses. Usually this includes a discussion of the various ways of representing numbers on computers: floating-point formats, round-off errors, perhaps even topics like interval arithmetic.
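
By way of illustration (my own toy snippets, not anything from Hubbard's course), the kind of round-off behaviour such a discussion starts from:

    import math

    # IEEE double precision surprises
    print(0.1 + 0.2 == 0.3)         # False: 0.1 and 0.2 are not exactly representable
    print(1.0 + 1e-16 == 1.0)       # True: 1e-16 is below half the machine epsilon (~2.2e-16)

    # Catastrophic cancellation: two mathematically equal expressions, very different accuracy
    x = 1e-8
    print(1 - math.cos(x))          # cancels to 0.0
    print(2 * math.sin(x / 2)**2)   # ~5e-17, the correct value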

For example, once the idea of ODEs is set up, he likes to talk about "fences". I don't know if this is standard terminology anywhere or just his, but it's basically like a Lyapunov function, but for time-dependent ODEs: it gives you regions that trap orbits, though the regions may move with time. He gets students used to thinking this way gradually, by cooking up fences in the 1-dimensional time-dependent ODE case first. Then he moves on to things like the Gronwall inequality, applying it to the Euler approximations of ODE solutions to observe error growth rates. He also proves Kantorovich's theorem, which he uses for the implicit and inverse function theorems. He has quite a lot of success getting 1st- and 2nd-year physics and engineering students thinking about these things, but it's known as the "challenging" calculus stream at Cornell, and less ambitious students have other options. I don't know what their numbers are now, but when I was a TA for the course I think he was getting around 80 students per year.
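
As a rough illustration of the Euler-error point (again my own example, not Hubbard's): for u' = u the exact solution is e^t, and the global error of forward Euler at a fixed time behaves like C(t) h, with C(t) growing exponentially in t, exactly the kind of bound Gronwall's inequality delivers:

    import math

    def euler_error(t_end, n_steps):
        """Global error of forward Euler for u' = u, u(0) = 1, at t = t_end."""
        h = t_end / n_steps
        u = 1.0
        for _ in range(n_steps):
            u += h * u
        return abs(u - math.exp(t_end))

    for n in (100, 200, 400, 800):
        print(n, euler_error(1.0, n))   # halving h roughly halves the error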


When I took a course on numerical analysis a couple of years ago, I very much liked the book "An Introduction to Numerical Analysis" by Süli and Mayers; it is very clear and concise. In particular, it contains a lot of rigorous error estimates.