The following is my answer. When I answered this question before, I was a single-chip microcomputer (MCU) development enthusiast. On the C51 (8051) MCU, int really is 16 bits, i.e. two bytes.
Now I work in application software development. With the compilers used there (such as VS and GCC), int is generally 4 bytes (32 bits), whether the target platform is 32-bit or 64-bit. The official explanation is that the compiler may choose a size appropriate to its hardware, subject to these constraints: short and int must be at least 16 bits, long must be at least 32 bits, short must not be longer than int, and int must not be longer than long. In other words, the exact width of each integer type is implementation-defined, i.e. decided by the compiler.
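If you want to verify this on your own toolchain, a quick sizeof check works. This is just a minimal sketch; the output depends on the compiler and target (a 64-bit Linux GCC build typically prints 2 / 4 / 8, while Keil C51 would report int as 2 bytes):

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    /* sizeof results are implementation-defined; the C standard only
       guarantees short >= 16 bits, long >= 32 bits, and
       sizeof(short) <= sizeof(int) <= sizeof(long). */
    printf("short: %zu bytes\n", sizeof(short));
    printf("int:   %zu bytes\n", sizeof(int));
    printf("long:  %zu bytes\n", sizeof(long));

    /* limits.h exposes the actual ranges on this platform. */
    printf("INT_MAX = %d\n", INT_MAX);
    return 0;
}
```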