Strange definitions of TRUE and FALSE macros

Let's see: '/' / '/' means the char literal '/', divided by the char literal '/' itself. The result is one, which sounds reasonable for TRUE.

And '-' - '-' means the char literal '-', subtracted from itself. This is zero (FALSE).

There are two problems with this: first, it's not readable; using 1 and 0 is clearly better. Also, as TartanLlama and KerrekSB have pointed out, if you are ever going to use these definitions, please do add parentheses around them so you won't have any surprises:

#include <stdio.h>

#define TRUE  '/'/'/'
#define FALSE '-'-'-'

int main() {
    printf("%d\n", 2 * FALSE);
    return 0;
}

Without parentheses, 2 * FALSE expands to 2 * '-'-'-', which is parsed as (2 * '-') - '-'. So this will print the value of the char literal '-' (45 on my system: 2 * 45 - 45 = 45).

With parentheses:

#define TRUE  ('/'/'/')
#define FALSE ('-'-'-')

the program correctly prints zero. Multiplying a truth value by an integer doesn't make much sense, but it illustrates the kind of unexpected bug that can bite you if you don't parenthesize your macros.


It's just another way of writing

#define TRUE 1
#define FALSE 0

The expression '/'/'/' will divide the char value of '/' by itself, which will give 1 as a result.

The expression '-'-'-' will subtract the char value of '-' from itself, which will give 0 as a result.

Parentheses around the whole define expressions are missing, though, which can lead to errors in code that uses these macros. Jay's answer addresses that pretty well.

An example of a "real-life" scenario where forgetting the parentheses can be harmful is combining these macros with a C-style cast operator. If someone decides to cast these expressions to bool in C++, for instance:

#include <iostream>

#define TRUE  '/'/'/'
#define FALSE '-'-'-'

int main() {
    std::cout << "True: " << (bool) TRUE << std::endl;
    std::cout << "False: " << (bool) FALSE << std::endl;
    return 0;
}

Here's what we get:

True: 0
False: -44

So (bool) TRUE would actually evaluate to false, and (bool) FALSE would evaluate to true. The cast binds tighter than the arithmetic: (bool) TRUE expands to ((bool)'/') / '/', which is 1 / 47 = 0, and (bool) FALSE expands to ((bool)'-') - '-', which is 1 - 45 = -44, a non-zero (hence truthy) value.


Jay already answered why the values of these expressions are 1 and 0.

For history's sake, the expressions '/'/'/' and '-'-'-' come from one of the entries of the 1st International Obfuscated C Code Contest in 1984:

int i;main(){for(;i["]<i;++i){--i;}"];read('-'-'-',i+++"hell\
o, world!\n",'/'/'/'));}read(j,i,p){write(j/p+p,i---j,i/i);}

(Link to the program here; there is a hint of what this program does on the IOCCC page above.)

Also, if I remember correctly, these expressions were covered as obfuscated macros for TRUE and FALSE in the book "Obfuscated C and Other Mysteries" by Don Libes (1993).


It is equivalent to writing

#define TRUE 1
#define FALSE 0

What the expression '/'/'/' actually does is divide the char value of '/' (whatever its numeric value is) by itself, so it evaluates to 1.

Similarly, the expression '-'-'-' subtracts the char value of '-' from itself and evaluates to 0.

It would be better to write

#define TRUE ('/'/'/')
#define FALSE ('-'-'-')

to avoid accidental changes in value when the macros are used next to operators with higher precedence.