What is the difference between static_cast<> and C style casting?

In short:

  1. static_cast<>() gives you compile-time checking; a C-style cast doesn't.
  2. static_cast<>() is more readable and can be spotted easily anywhere inside C++ source code; a C-style cast can't.
  3. Intentions are conveyed much better using C++ casts.

More Explanation:

The static cast performs conversions between compatible types. It is similar to the C-style cast, but is more restrictive. For example, the C-style cast would allow an integer pointer to point to a char.

char c = 10;       // 1 byte
int *p = (int*)&c; // typically 4 bytes

Since this results in a pointer to a (typically) 4-byte type pointing to 1 byte of allocated memory, writing through this pointer will either cause a run-time error or will overwrite some adjacent memory.

*p = 5; // run-time error: stack corruption

In contrast to the C-style cast, the static cast will allow the compiler to check that the pointer and pointee data types are compatible, which allows the programmer to catch this incorrect pointer assignment during compilation.

int *q = static_cast<int*>(&c); // compile-time error



C++-style casts let the compiler (and, for dynamic_cast, the runtime) check the conversion. A C-style cast silently applies whichever conversion makes the code compile, so a wrong cast can go unnoticed until it fails at runtime.
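For example (a minimal sketch; the Base/Derived/Other types are made up for illustration), a C-style downcast to the wrong type compiles without complaint, while dynamic_cast actually verifies the object's type at run time:

#include <iostream>

struct Base { virtual ~Base() = default; };
struct Derived : Base { int value = 42; };
struct Other : Base {};

int main() {
    Base* b = new Other();                    // really points to an Other

    Derived* d1 = (Derived*)b;                // C-style cast: compiles, never checked
    // reading d1->value here would be undefined behaviour
    (void)d1;

    Derived* d2 = dynamic_cast<Derived*>(b);  // checked against the actual type at run time
    if (d2 == nullptr)
        std::cout << "b does not point to a Derived\n";  // this branch runs

    delete b;
}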

Also, C++-style casts can be searched for easily, whereas it's really hard to search for C-style casts.

Another big benefit is that the 4 different C++ style casts express the intent of the programmer more clearly.
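To illustrate (a minimal sketch; the types and variable names are made up), each of the four casts announces a single, narrower purpose:

#include <cstdint>

struct Base { virtual ~Base() = default; };
struct Derived : Base {};

int main() {
    double ratio = 3.7;
    int truncated = static_cast<int>(ratio);            // ordinary, well-defined value conversion

    const int limit = 100;
    const int* readOnly = &limit;
    int* writable = const_cast<int*>(readOnly);         // explicitly strips const (use with care)
    (void)writable;

    std::uintptr_t address =
        reinterpret_cast<std::uintptr_t>(&truncated);   // low-level reinterpretation of the bits
    (void)address;

    Base* base = new Derived();
    Derived* derived = dynamic_cast<Derived*>(base);    // run-time-checked downcast
    (void)derived;
    delete base;
}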

When writing C++, I'd pretty much always use the C++ ones over the C-style ones.


See A comparison of the C++ casting operators.

The C-style cast's one-size-fits-all syntax is convenient, but using the same syntax for a variety of different casting operations can make the intent of the programmer unclear.

Furthermore, it can be difficult to find a specific type of cast in a large codebase.

The generality of the C-style cast can be overkill for situations where all that is needed is a simple conversion. The ability to select between several different casting operators of differing degrees of power can prevent programmers from inadvertently casting to an incorrect type.
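For example, picking the least powerful cast that could possibly do the job lets the compiler catch an accidental loss of const, which a C-style cast would silently allow (a minimal sketch):

int main() {
    const char* message = "hello";

    // The C-style cast is powerful enough to strip const and convert in one
    // step, so this mistake compiles silently:
    char* oops = (char*)message;
    (void)oops;

    // Choosing the least powerful cast that could do the job lets the compiler
    // object: static_cast refuses to cast away const.
    // char* safe = static_cast<char*>(message); // error: casts away qualifiers
}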