
Java: Subtract ‘0’ from char to get an int… why does this work?

This works fine:

int foo = bar.charAt(1) - '0';

Yet this doesn’t give the digit’s value, because bar.charAt(x) returns a char and the assignment stores its character code instead:

int foo = bar.charAt(1);

It seems that subtracting ‘0’ from the char effectively casts it to an integer.

Why, or how, does subtracting the string ‘0’ (or is it a char?) convert another char into an integer?


Answer

That’s a clever trick. A char is really a numeric type: it has the same 16-bit width as a short, but it is unsigned. The digits ‘0’ through ‘9’ occupy consecutive code points in ASCII/Unicode (48 through 57), so when you have a char that represents a digit (like ‘1’) and you subtract the smallest digit ‘0’ from it, you’re left with the digit’s numeric value (hence, 1).
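
For example, a minimal sketch (the string "42" is made up for illustration):

String bar = "42";
char c = bar.charAt(1);    // '2', code point 50
int foo = c - '0';         // 50 - 48 = 2
System.out.println(foo);   // prints 2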

Because char is an unsigned 16-bit type, every char value fits in an int, so the conversion is always safe. Java performs it automatically whenever arithmetic is involved: binary numeric promotion widens both char operands to int before the subtraction, so the result of the expression is already an int.
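
To see the promotion at work, a small sketch (the variable names are illustrative):

char c = '7';
int value = c - '0'; // both operands are promoted to int: 55 - 48 = 7
int code = c;        // plain assignment widens the char instead: code is 55, not 7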

User contributions licensed under: CC BY-SA