Could someone please explain the following: I tried using the sizeof operator to find the size of an int variable with printf("%d\n", sizeof(int)); I was expecting an output of 2, but interestingly the value I got was 512. Can anyone explain the significance of this value? I tested the sizes of other variables and discovered something interesting: 512 in decimal is 0x0200 in hexadecimal. Ignoring the low byte, the high byte gives the value 2 that I was expecting. Similarly, for the size of a character the value I got was 256, which is 0x0100 in hexadecimal; again the high byte gives the data size. Could someone please enlighten me as to why this is so?
I would like to add that I am using the C51 compiler and the debugger to simulate the output at the serial port. Thank you.
http://www.keil.com/support/docs/655.htm Jon
I followed that link and was a bit surprised:

    /* Workaround #1 */
    /* Cast the sizeof to 'unsigned int' */
    printf("%d", (unsigned int) sizeof(x));

    /* Workaround #2 */
    /* Use '%bd' to tell printf that a character is passed */
    printf("%bd", sizeof(x));

    Workaround #2 only works for values up to 127.

Why does workaround #1 cast to unsigned int and then use the signed int conversion specifier? Even more bizarre, why does workaround #2 pass the (presumably unsigned) char result of sizeof and then convert it with the signed specifier, noting that this "only works up to 127", rather than using the unsigned specifier? Presumably the result of sizeof is treated as unsigned int if the size of the object exceeds 255, in which case workaround #2 is no better than the problem it is supposed to address. I did try to find sizeof in the manual, without success. Stefan
I followed that link and was a bit surprised: The knowledgebase article has been updated to be correct all the time. Jon