Could someone please explain the following: I tried using the sizeof operator to find the size of an int variable with printf("%d\n", sizeof(int)); and was expecting an output of 2. Interestingly, the value I got was 512. Can anyone explain the significance of this value?

I tested the sizes of other variables and noticed something interesting. 512 in decimal is 0x0200 in hexadecimal. Ignoring the last two hex digits, the first two give the value 2 that I was expecting. Similarly, for the size of a char the value I got was 256, which is 0x0100 in hexadecimal; again, the first two hex digits give the actual data size. Could someone please enlighten me as to why this is so?
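For what it's worth, the usual explanation is that sizeof does not yield an int: it yields a size_t, which may be a different width from int on your platform. Passing it to printf with "%d" is therefore a type mismatch (formally undefined behaviour), and depending on the sizes, byte order and calling convention involved, printf can end up reading the wrong bytes of the argument, which would account for the expected value showing up shifted by one byte (2 appearing as 0x0200 = 512, 1 as 0x0100 = 256). A minimal sketch of the portable way to print it, assuming a C99-capable compiler for the %zu line (the cast form also works on older compilers):

#include <stdio.h>

int main(void)
{
    /* sizeof yields a size_t, so give printf a matching conversion. */
    printf("%zu\n", sizeof(int));                 /* C99 and later: %zu matches size_t   */
    printf("%lu\n", (unsigned long)sizeof(int));  /* pre-C99: cast to a known-width type */
    return 0;
}

Either line should print the true size of int on your system instead of a byte-shifted value.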
I followed that link and was a bit surprised: the knowledgebase article has since been updated to be correct in all cases.

Jon