
Unexpected output from "sizeof"

Could someone please explain the following:

I tried using the sizeof operator to find out the size of an int variable by

printf("%d\n", sizeof(int));

I was expecting an output of 2. But interestingly the value that I got was 512. Can anyone explain the significance of this value?

I tried testing the size of other variables and discovered something interesting. 512 in decimal is 0x0200 in hexadecimal. Ignoring the last two hex digits (the low byte), the first two hex digits (the high byte) give the value 2, which is what I was expecting.

Similarly, for the size of a character, the value I got was 256, which is 0x0100 in hexadecimal. Looking at the first two hex digits (the high byte) again gives the expected data size.
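
Just to double-check that arithmetic (a quick host-side sketch, nothing Keil-specific; the constants 512 and 256 are simply the values quoted above):

#include <stdio.h>

int main(void)
{
    unsigned int observed_int  = 512;   /* value printed for sizeof(int)  */
    unsigned int observed_char = 256;   /* value printed for sizeof(char) */

    /* The high byte of each observed value is the size that was expected. */
    printf("0x%04X -> high byte %u, low byte %u\n",
           observed_int, observed_int >> 8, observed_int & 0xFF);
    printf("0x%04X -> high byte %u, low byte %u\n",
           observed_char, observed_char >> 8, observed_char & 0xFF);

    return 0;
}

This prints high byte 2 and high byte 1 respectively, with a low byte of 0 in both cases, matching the pattern described.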

Could someone please enlighten me as to why this is so?
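
One possible explanation (an assumption, not confirmed here): if the compiler passes the constant result of sizeof as a single byte while %d pulls two bytes off the argument area, the byte 2 ends up in the high-byte position of the value printf reads, which comes back as 0x0200 = 512. Whatever the cause, casting the result of sizeof to a type that matches the format specifier avoids the mismatch; a minimal sketch:

#include <stdio.h>

int main(void)
{
    /* sizeof yields a size_t; the explicit cast guarantees the argument
       width matches the format specifier on any compiler. */
    printf("%u\n", (unsigned int)sizeof(int));
    printf("%u\n", (unsigned int)sizeof(char));
    return 0;
}

On a C99-conforming host, %zu prints a size_t directly; if the toolchain in question is Keil C51, its library also documents byte-width format modifiers such as %bu.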

  • I'm glad we don't make a compiler for a 4-bit architecture

    Just write a compiler for a 1-bit architecture, and run it really fast to approximate a slower 16-bit architecture.

    That parallel bus stuff is just so 20th century.

  • Thus we have an inherent conflict on an 8-bit platform!

    Not really. The "natural size" is not actually a requirement or defintion, but just a suggestion. It can't be a requirement anyway: "natural size" is way to sloppy a word for such usage.

    The lower bounds for INT_MAX and -INT_MIN, on the other hand, are strict requirements.

    So, in this case, the strict requirement simply overrules the suggestion. Keil is fully correct here, using 16-bit ints even though it's an 8-bit platform. Keeping in mind the minimum requirements on INT_MAX and INT_MIN, 16-bit ints are the "natural size suggested by the execution environment", as the small check below illustrates.
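
    A quick way to see those limits for yourself (a minimal sketch; the printed values are simply whatever <limits.h> defines for the toolchain in use):

        #include <limits.h>
        #include <stdio.h>

        int main(void)
        {
            /* The C standard requires INT_MAX >= 32767 and INT_MIN <= -32767,
               so an int must be at least 16 bits wide even on an 8-bit CPU. */
            printf("INT_MIN = %d\n", INT_MIN);
            printf("INT_MAX = %d\n", INT_MAX);
            printf("sizeof(int) = %u\n", (unsigned int)sizeof(int));
            return 0;
        }

    On a C51 target this should report -32768, 32767 and 2, consistent with 16-bit ints.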