Why does adding a '0' to an int digit allow conversion to a char?

When ASCII encoding is used, the integer value of '0' is 48.

'0' + 1 = 49 = '1'
'0' + 2 = 50 = '2'

...

'0' + 9 = 57 = '9'

So, if you want to convert a digit to its corresponding character, just add '0' to it.

Even if the platform uses a non-ASCII encoding, the language still guarantees that the characters '0' through '9' are encoded contiguously and in order, so that:

'1' - '0' = 1
'2' - '0' = 2
'3' - '0' = 3
'4' - '0' = 4
'5' - '0' = 5
'6' - '0' = 6
'7' - '0' = 7
'8' - '0' = 8
'9' - '0' = 9

When ASCII encoding is used, that becomes:

'1' - '0' = 49 - 48 = 1
'2' - '0' = 50 - 48 = 2
'3' - '0' = 51 - 48 = 3
'4' - '0' = 52 - 48 = 4
'5' - '0' = 53 - 48 = 5
'6' - '0' = 54 - 48 = 6
'7' - '0' = 55 - 48 = 7
'8' - '0' = 56 - 48 = 8
'9' - '0' = 57 - 48 = 9

Hence, regardless of the character encoding used by a platform, the lines

int i = 2;
char c = i + '0';

will always result in the value of c being equal to the character '2'.


If you look at an ASCII table, you'll see that the digits start at 48 (being '0') and go up to 57 (for '9'). So in order to get the character code for a digit, you can add that digit to the character code of '0'.