Should I bother teaching buffer overflows any more?

Absolutely. ASLR and DEP are defense-in-depth measures. Exploits exist that can bypass each of them (for a real-world example, look at the Pwn2Own exploit Peter Vreugdenhil used against IE).

All you need to bypass ASLR on Windows is an information disclosure vulnerability that reveals the base address of a loaded DLL in the process (that was the first vuln Vreugdenhil exploited). From there, you can use a ret-to-libc attack to call any function in that DLL.
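
To see why one leaked pointer is enough, here's a minimal sketch of the arithmetic (the base and offset values below are made up for illustration): within a module, function offsets are fixed at link time, so base + offset pins down every call target.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustration of why one leaked pointer defeats ASLR: the loader
 * randomizes only the module's base address, while the offset of each
 * function inside the module is fixed at link time.                  */
int main(void) {
    /* Pretend an info-leak bug disclosed this randomized DLL base.   */
    uintptr_t leaked_base = 0x6a200000u;        /* made-up value      */

    /* Offset of a target function within the DLL, known from static
     * analysis of the file on disk; it is the same on every run.     */
    const uintptr_t target_offset = 0x1a2b0u;   /* made-up value      */

    /* The attacker now knows exactly where to "return" to.           */
    uintptr_t target = leaked_base + target_offset;
    printf("ret-to-libc target: %#lx\n", (unsigned long)target);
    return 0;
}
```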

The bottom line: stack (and heap) overflows are absolutely still relevant today. They're harder to exploit than they used to be, but they're still relevant.


Besides @Larry's and @SteveS's excellent, concise answers, I want to add one very important point:

"The students are skeptical that turning off non-executable stacks, turning off canaries and turning off ASLR represents a realistic environment."

Hopefully this is true for your students' systems.
In the rest of the world, however, this is unfortunately still very common. Besides the platforms that don't support these protections at all, there are always poorly built products that require shutting them off, older OS versions, and even plain misconfigurations.
Still very realistic, sadly.

On top of all that, two more comments from an educator's point of view:
1. Somebody has to build those defenses, right?
2. Even if they were hypothetically right, "you only need pointers in C/C++" doesn't mean a Java developer shouldn't learn how these things work inside the computer, right?


Yes. Apart from the systems where buffer overflows still lead to successful exploits, a full explanation of buffer overflows is a great way to demonstrate how you should think about security: instead of concentrating on how the application should run, look at what can be done to make the application derail.
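
For instance, a minimal (hypothetical) classroom example, built around the canonical unchecked strcpy():

```c
#include <stdio.h>
#include <string.h>

/* The canonical classroom bug: buf holds 16 bytes, but strcpy() keeps
 * copying until it hits a NUL byte, so a longer argument overwrites
 * whatever sits beyond buf on the stack (canary, saved registers,
 * return address...).                                                */
static void greet(const char *name) {
    char buf[16];
    strcpy(buf, name);            /* no bounds check: the bug */
    printf("hello, %s\n", buf);
}

int main(int argc, char **argv) {
    if (argc > 1)
        greet(argv[1]);           /* attacker-controlled input */
    return 0;
}
```

With default toolchain protections, a long argument typically just crashes the process; reproducing the classic exploit in a lab usually means building with them disabled (e.g. gcc -fno-stack-protector -z execstack on Linux, with ASLR switched off via /proc/sys/kernel/randomize_va_space), which is exactly the "unrealistic" environment the students are skeptical about.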

Also, regardless of non-executable stacks and however many screaming canaries you install, a buffer overflow is a bug. All those security features merely alter the consequences of the bug: instead of a remote shell, you "just" get an immediate application crash. Not caring about application crashes (in particular crashes which can be triggered remotely) is, at best, very sloppy programming. Not on my watch!
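
To make "altering the consequences" concrete, here's a hand-rolled analogue of what a compiler-emitted canary does; the guard value, layout and manual check are invented for illustration (real canaries are inserted automatically by the compiler, e.g. GCC's -fstack-protector, with a random per-process value):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hand-rolled stand-in for a compiler-emitted stack canary; the fixed
 * value and explicit check are purely for illustration.              */
#define CANARY 0xDEADBEEFu

static void copy_input(const char *input) {
    struct {
        char buf[16];
        unsigned guard;           /* sits right after the buffer */
    } frame = { .guard = CANARY };

    strcpy(frame.buf, input);     /* the overflow bug is still there */

    /* Check the guard before "returning": corruption means an
     * immediate abort, so a potential hijack becomes a plain crash. */
    if (frame.guard != CANARY) {
        fprintf(stderr, "stack smashing detected: aborting\n");
        abort();
    }
    printf("copied: %s\n", frame.buf);
}

int main(int argc, char **argv) {
    if (argc > 1)
        copy_input(argv[1]);
    return 0;
}
```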

For completeness: non-executable stacks and canaries do not prevent buffer overflows; they just shut off some of the easy ways to exploit them. The traditional buffer overflow replaces the return address with a pointer to malicious code loaded as part of the data that overflowed the buffer; the malicious code runs when the function returns. A non-executable stack means the attacker cannot place his code on the stack (he has to arrange a jump into existing library code instead, e.g. the execve() implementation in the standard library). A canary detects that a stack buffer was overflowed before the corrupted return address can be used (assuming the overflow is "simple": one contiguous chunk of data). But an overflow may also overwrite other data, including function pointers (in particular in the context of C++).
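
Here's a sketch of that last point (the layout is forced with a struct for illustration; it's not a guaranteed stack arrangement): the overflow corrupts an adjacent function pointer, and control flow is hijacked before the function ever returns, so a return-address canary never fires.

```c
#include <stdio.h>
#include <string.h>

static void expected(void)     { puts("normal behaviour"); }
static void never_called(void) { puts("attacker-chosen behaviour"); }

/* The overflow never touches the return address, so a canary guarding
 * it stays intact; the corrupted function pointer is used *before*
 * the function returns.                                              */
static void handler(const char *input) {
    struct {
        char buf[8];
        void (*callback)(void);
    } f = { .callback = expected };

    strcpy(f.buf, input);         /* 8+ bytes spill into f.callback */
    f.callback();                 /* control flow hijacked pre-return */
}

int main(int argc, char **argv) {
    /* An attacker would aim the pointer at a function like this one;
     * printing its address stands in for the info leak they'd need. */
    printf("gadget at %p\n", (void *)never_called);
    if (argc > 1)
        handler(argv[1]);
    return 0;
}
```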