First, the size in bytes is different.
1. Basic type: on a 32-bit C compiler, the basic type int takes up 4 bytes.
2. Short integer: on a 32-bit C compiler, the short integer short int takes up only 2 bytes (see the check below).
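These sizes can be verified directly with sizeof; a minimal sketch, keeping in mind that the exact values are compiler- and platform-dependent (the C standard itself only guarantees that short is no wider than int):

    #include <stdio.h>

    int main(void)
    {
        /* On a typical 32-bit (or 64-bit) compiler this prints 4 and 2;
           the standard only guarantees sizeof(short) <= sizeof(int). */
        printf("sizeof(int)       = %zu\n", sizeof(int));
        printf("sizeof(short int) = %zu\n", sizeof(short int));
        return 0;
    }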
Second, the precision is different.
1. Basic type: when a basic type int is converted to a short integer, the value is truncated and only the int's low-order bytes are kept in the short, so precision can be lost.
2. Short integer: when a short integer is converted to a basic type int, all of the short's bytes fit into the int, so no precision is lost (the sketch below demonstrates both directions).
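A minimal sketch of both conversions, assuming the usual 4-byte int and 2-byte short; note that the exact result of an out-of-range signed conversion is implementation-defined, though two's-complement truncation is the common behavior:

    #include <stdio.h>

    int main(void)
    {
        int   big  = 0x12345678; /* needs all 4 bytes of an int             */
        short s    = (short)big; /* only the low-order 2 bytes survive      */
        int   back = s;          /* widening back to int loses nothing more */

        printf("int   -> short: 0x%hX\n", s);    /* typically 0x5678        */
        printf("short -> int  : 0x%X\n",  back); /* 0x5678, precision kept  */
        return 0;
    }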
Third, the modifiers are different.
1. Basic type: the basic type int is signed by default; it can take an explicit signed or unsigned modifier, and it can also be combined with the size modifiers short and long.
2. Short integer: short int is likewise signed by default and can take signed or unsigned (unsigned short is standard C), but it cannot be combined with a further size modifier such as long (see the example below).
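A minimal sketch of the modifier combinations the C standard allows under the corrected rules above; the commented-out declaration is the one that does not compile:

    #include <stdio.h>

    int main(void)
    {
        int            a = -1; /* "int" alone means signed int          */
        unsigned int   b = 1;  /* unsigned is valid for the basic type  */
        short          c = -1; /* "short" alone means signed short int  */
        unsigned short d = 1;  /* unsigned short is also standard C     */
        /* long short   e;        error: short cannot take a further
                                  size modifier such as long            */

        printf("%d %u %hd %hu\n", a, b, c, d);
        return 0;
    }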