What is the argument for printf that formats a long?

Put an l (lowercase letter L) directly before the conversion specifier.

unsigned long n;
long m;

printf("%lu %ld", n, m);

I think you mean:

unsigned long n;
printf("%lu", n);   // unsigned long

or

long n;
printf("%ld", n);   // signed long

On 32-bit platforms and on 64-bit Windows, long and int are the same size (32 bits); on 64-bit Linux and macOS, long is 64 bits. Either way, long has its own format specifier:

long n;
unsigned long un;
printf("%ld", n); // signed
printf("%lu", un); // unsigned

For 64 bits, you'd want a long long:

long long n;
unsigned long long un;
printf("%lld", n); // signed
printf("%llu", un); // unsigned

Oh, and of course, it's different on Windows (the Microsoft C runtime uses its own length modifier):

printf("%I64d", n); // signed
printf("%I64u", un); // unsigned

Frequently, when I'm printing 64-bit values, I find it helpful to print them in hex (usually, numbers that big are pointers or bit fields).

unsigned long long n;
printf("0x%016llX", n); // "0x" followed by "0-padded", "16 char wide", "long long", "HEX with 0-9A-F"

will print:

0x00000000DEADBEEF
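
For reference, a complete version of that snippet, using an illustrative value that matches the output above:

#include <stdio.h>

int main(void)
{
    unsigned long long n = 0xDEADBEEFULL;

    // "0x", then zero-padded, 16 chars wide, long long, uppercase hex
    printf("0x%016llX\n", n);   // prints 0x00000000DEADBEEF
    return 0;
}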

Btw, "long" doesn't mean that much anymore (on mainstream x64). "int" is the platform default int size, typically 32 bits. "long" is usually the same size. However, they have different portability semantics on older platforms (and modern embedded platforms!). "long long" is a 64-bit number and usually what people meant to use unless they really really knew what they were doing editing a piece of x-platform portable code. Even then, they probably would have used a macro instead to capture the semantic meaning of the type (eg uint64_t).

char c;       // 8 bits
short s;      // 16 bits
int i;        // 32 bits (on modern platforms)
long l;       // 32 or 64 bits, depending on the platform
long long ll; // 64 bits 
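
Those sizes vary by platform, so if you want to see what they actually are on yours:

#include <stdio.h>
#include <limits.h>   // CHAR_BIT

int main(void)
{
    // sizeof yields a size_t, so %zu is the matching specifier (C99)
    printf("char:      %zu bits\n", sizeof(char) * CHAR_BIT);
    printf("short:     %zu bits\n", sizeof(short) * CHAR_BIT);
    printf("int:       %zu bits\n", sizeof(int) * CHAR_BIT);
    printf("long:      %zu bits\n", sizeof(long) * CHAR_BIT);
    printf("long long: %zu bits\n", sizeof(long long) * CHAR_BIT);
    return 0;
}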

Back in the day, "int" was 16 bits. You'd think it would now be 64 bits, but no, that would have caused insane portability issues. Of course, even this is a simplification of the arcane and history-rich truth; see the Wikipedia article on Integer.