It won't error out, since it's not technically a bad type. A char is just a byte, and a byte is a "lower" (narrower) primitive than an int. You can always stick a narrower primitive into a wider datatype, and more often than not without an explicit conversion.
Effectively, if the value can be stuffed into the variable without losing precision, the compiler will implicitly convert it to match: byte -> short int -> int -> long int -> float -> double. Going the other way requires an explicit cast, because narrowing can lose data. This is the same behaviour that C follows.