Could someone please explain the following: I tried using the sizeof operator to find out the size of an int variable with printf("%d\n", sizeof(int)); I was expecting an output of 2, but interestingly the value that I got was 512. Can anyone explain the significance of this value? I tried testing the size of other variables and I discovered something interesting. 512 in decimal is 0x0200 in hexadecimal. Ignoring the last two hex digits (the low byte), the first two hex digits (the high byte) give the value 2, which I was expecting. Similarly, for the size of a character, the value I got was 256, which is 0x0100 in hexadecimal. Again, the high byte gives the data size. Could someone please enlighten me why this is so?
So a C51 int should really be 8 bits, since the 8051 is an 8-bit processor... except for the requirement that an int must be able to represent at least the range given by INT_MIN and INT_MAX as defined in limits.h. Therefore, an ISO-conforming implementation cannot represent the int type in only eight bits. Geez. I'm glad we don't make a compiler for a 4-bit architecture. Jon