
Unexpected output from "sizeof"

Could someone please explain the following:

I tried using the sizeof operator to find out the size of an int variable with:

printf("%d\n", sizeof(int));

I was expecting an output of 2. But interestingly the value that I got was 512. Can anyone explain the significance of this value?

I tried testing the size of other variables and I discovered something interesting. 512 in decimal is actually 0200 in hexadecimal. Ignoring the last two bytes, the first two bytes actually give the value 2 which I was expecting.

Similarly, for the size of a character, the value I got was 256. When converted to hexadecimal it is 0100. Looking at the first two bytes actually gives the data size.

Could someone please enlighten me why is this so?

  • "... the value I got was 256. When converted to hexadecimal it is 0100. Looking at the first two bytes [sic] actually gives the data size."

    Note that you mean the first two digits - not the first two bytes!

    A single hex digit represents (up to) four bits; i.e., half a byte - widely called a "nibble".
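
    For what it's worth, the mismatched conversion specifier is the usual suspect for strange values like these: sizeof yields a size_t, not an int, so printing it with %d is undefined behaviour, and what actually comes out depends on the compiler and how arguments are passed. A minimal sketch of the portable way to print it (assuming a C99 compiler for the %zu specifier):

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* sizeof evaluates to a size_t; %zu is the matching
           conversion specifier (C99 and later). */
        printf("sizeof(int)  = %zu\n", sizeof(int));
        printf("sizeof(char) = %zu\n", sizeof(char)); /* always 1 by definition */

        /* On pre-C99 compilers without %zu, cast to a known type
           instead of passing the raw size_t to %d: */
        printf("sizeof(int)  = %u\n", (unsigned int)sizeof(int));
        return 0;
    }
    ```

    Passing the raw size_t to %d only appears to work when size_t and int happen to have the same size and representation; on compilers where they differ you can get values like the 512 and 256 seen above.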
