I'm trying to print out sizeof() values for some of my structures, and they are coming out incorrect. Here is a small table of what I get on different platforms:

Linux  : sizeof(TChannel) = 1460
Windows: sizeof(TChannel) = 1460
8051   : sizeof(TChannel) = 1361

Are there byte-alignment issues perhaps? I have both Linux and Windows defaulting to a 1-byte boundary for byte alignment in structures. Does the 8051 default to something different?

Here's my code for the 8051:

```c
Debugf( "sizeof(TChannel) = %u\r\n", sizeof( TChannel ) );
```

I've tried %u, %d, %lu, %bu, and %X but can't get the right value. Here's my Debugf() function in case that might be messing things up:

```c
void Debugf( BYTE* format, ... )
{
#ifdef DEBUG
    xdata BYTE buf[64];
    va_list arglist;

    va_start( arglist, format );
    vsprintf( buf, format, arglist );
    va_end( arglist );

    SendSerialData( buf, strlen( buf ) );
#endif
}
```

I don't think the problem is in my SendSerialData() function, as that seems to work well. Any ideas?
You don't show the truly important detail: the definition of that struct. Either way, I strongly doubt that a missing cast in the printf call could ever change a value from 1460 to 1361, so the true difference is almost certainly elsewhere. You should probably check the member types and alignment on the PC side more carefully, and compare them with the "real" design size of your struct.
After making sure my structs on Windows were byte aligned with:

```c
#ifdef _WINDOWS_
#pragma pack (1)
#endif
```

I also discovered that on Windows a BOOL is 4 bytes, whereas I had defined it as an unsigned char on the 8051, which is only 1 byte. Likewise, a UINT on Windows is 4 bytes, but I had defined it as unsigned int on the 8051, where it's only 2 bytes.

```c
#ifndef _WINDOWS_
typedef unsigned char BYTE;
typedef unsigned int  UINT;
typedef unsigned long ULONG;
typedef unsigned char BOOL;
typedef unsigned int  WORD;
typedef unsigned long DWORD;
typedef unsigned int  USHORT;
#endif
```