unsigned char aa, bb, cc, dd;
aa = 0xab;
bb = 0xcd;
cc = (aa + bb) % 255;                   // aa + bb = 0x178
dd = (unsigned char)((aa + bb) % 255);
When debugging, you can see the result: cc is 0x78 and dd is 0x79. In fact, both cc and dd should be 0x79.
I debugged this in C51 9.60 and 9.03; both produced the same output.
I tried it in VC2010 and TI CCS 3.3; both give the correct result, 0x79.
Getting 0x78 here is in direct violation of all applicable standards, for both C and C++, in every standard revision of either. The integer promotions have been part of C since well before the first C standard.
I.e. the language does, in fact, guarantee the result in this case, even though it might be less than obvious to the casual reader: both unsigned char operands are promoted to int before the addition, so aa + bb evaluates to 0x178 (376), and 376 % 255 is 121, i.e. 0x79. Getting something else can be a compiler bug; but that depends on whether the compiler in question was run in a mode that promises to respect the standard(s). If it promised no such thing, breaking the promise isn't a bug.