Hi all. Below is the result of some debugging: I have isolated some code from a bigger project and put it into a stand-alone project.
I basically don't understand why I can do right bit-shifting and AND'ing on a single line, but can't do left bit-shifting and AND'ing on a single line.
If it isn't a bug, then what have I missed?
I have included many comments to describe my problem.
#include <ADUC832.H>
#include <string.h>

void main(void)
{
    char ascii[] = "MC";
    unsigned char pdu[3];
    int w = 0, r = 0, len;
    char ch1, ch2, rr, rl;

    /* This is what I want to do:

       while-loop run 1:
       1: Assign to var 'ch1': ch1 = 'M' (= 0x4D = 0100 1101)
       2: Assign to var 'ch2': ch2 = 'C' (= 0x43 = 0100 0011)
       3: Assign to var 'w':   w = 0
       4: OR together the following:
          ((ch1 >> (w%7)) & 0x7F) | ((ch2 << (7-(w%7))) & 0xFF);
          <=> 0100 1101 | 1000 0000
          <=> 1100 1101
          <=> 0xCD

       while-loop run 2:
       1: Assign to var 'ch1': ch1 = 'C' (= 0x43 = 0100 0011)
       2: Assign to var 'ch2': ch2 = 0x00
       3: Assign to var 'w':   w = 1
       4: OR together the following:
          ((ch1 >> (w%7)) & 0x7F) | ((ch2 << (7-(w%7))) & 0xFF);
          <=> 0010 0001 | 0000 0000
          <=> 0010 0001
          <=> 0x21
    */

    len = strlen(ascii);

    while (r < len)
    {
        // ------ First OR-part -----------------------
        // ------ Both versions below are OK ----------

        // -- VER 1: OK
        // ch1 = ascii[r];
        // rr = (w%7);
        // ch1 = (ch1 >> rr) & 0x7F;

        // -- VER 2: OK
        ch1 = (ascii[r] >> (w%7)) & 0x7F;  // Bit-shifting and AND'ing
                                           // may be done in one line

        // ------ Second OR-part -----------------------------
        // ------ Both versions below are NOT OK ?? ----------

        // -- VER 1: OK
        ch2 = ascii[r+1];
        rl = (7-(w%7));
        ch2 = (ch2 << rl) & ((char)0xFF);  // Bit-shifting and AND'ing can be
                                           // done in one line, IF a type cast
                                           // is used - why?
        // ch2 = ch2 & 0xFF;               // If split onto a new line,
                                           // no type cast is required?

        // -- VER 2: NOT OK
        // ch2 = (ascii[r+1] << (7-(w%7))) & 0xFF;  // type cast doesn't help
        // ch2 = ch2 & 0xFF;                        // AND'ing must be on a separate line?

        //----------------------------------------------------------------
        // IS THIS A BUG ??
        //----------------------------------------------------------------
        // Why can we bit-shift and AND in a single line for the first
        // OR-part above, but not for the second OR-part, where the
        // bit-shifting and AND'ing must be on two separate lines???
        //----------------------------------------------------------------

        // ------ Do the actual OR'ing -------
        pdu[w] = (ch1 | ch2);

        if ((w%7) == 6)
            r++;
        r++;
        w++;
    }
    pdu[w] = 0;  // terminator

    //----------------------------------------------------------------
    // Run to here in the debugger and look at the content of the
    // local variable 'pdu'.
    // When using the 'NOT OK' versions from above, pdu will contain
    // {0x4D, 0x21, 0x00} and not {0xCD, 0x21, 0x00} as the 'OK'
    // versions produce.
    //----------------------------------------------------------------

    while (1);
}
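For comparison, the intended packing can be written with explicitly unsigned operands, which sidesteps any char-promotion questions. This is a host-side sketch (the helper name `pack7` is made up, and this is not the original C51 build):

```c
#include <string.h>

/* Pack a 7-bit ASCII string into GSM-style octets (portable sketch). */
static void pack7(const char *ascii, unsigned char *pdu)
{
    size_t len = strlen(ascii);
    size_t r = 0, w = 0;

    while (r < len) {
        /* Low bits of the current character...                        */
        unsigned char ch1 = ((unsigned char)ascii[r] >> (w % 7)) & 0x7F;

        /* ...topped up with bits of the next one. Assigning the
           shifted value to an unsigned char truncates it to 8 bits,
           which is exactly what the & 0xFF was meant to achieve.
           When r == len-1, ascii[r+1] is the string's terminator.     */
        unsigned char ch2 = (unsigned char)ascii[r + 1] << (7 - (w % 7));

        pdu[w] = ch1 | ch2;

        if ((w % 7) == 6)  /* every 7th output octet eats an extra char */
            r++;
        r++;
        w++;
    }
    pdu[w] = 0;  /* terminator */
}
```

For the input "MC" this produces {0xCD, 0x21, 0x00}, matching the expected values in the comments above.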
// ch2 = (ascii[r+1] << (7-(w%7))) & 0xFF; // type cast doesn't help
How would you have type-cast here?
The way it is right now, it looks to the compiler like you're ANDing a char variable with 0xFF, which doesn't really make sense and can be optimized out.
AND'ing a char with 0xFF should leave the char unmodified, right? Then why isn't that the case? Try running the code in the Keil debugger.
You're right, a type cast shouldn't really be needed. But I have just observed that this produces correct results:
ch2 = ascii[r+1];                  // ch2 is a char
rl = (7-(w%7));                    // rl is a char
ch2 = (ch2 << rl) & ((char)0xFF);  // Use type cast
and this doesn't (i.e. without the type cast):
ch2 = ascii[r+1];            // ch2 is a char
rl = (7-(w%7));              // rl is a char
ch2 = (ch2 << rl) & (0xFF);  // don't use type cast
Ok, I can do everything in a single line:
pdu[w] = ((ascii[r] >> (w%7)) & 0x7F) | (ascii[r+1] << (7-(w%7)));
But still, why do I get incorrect results when I AND a char with 0xFF?
Have you tried different optimization levels?
Can you post the resulting assembly?
I sent an 'inconsistency report' to support regarding the following, which may be what this is about (all are U8, except GSloadCnt, which is U16):
GCselect = SEL_S882C;

GSloadCnt = GClwdt * GClhgt;            // momma
GSloadCnt = GSloadCnt / GX_ATT.FSLlin;  // momma
GSloadCnt = GSloadCnt / 2;              // momma

works; this (which is the same, concatenated)

GSloadCnt = (((GClwdt * GClhgt) / GX_ATT.FSLlin) / 2);

gives the wrong result (zero), but this gives the right result:

GSloadCnt = (((GClwdt * GClhgt) / (U16) GX_ATT.FSLlin) / 2);
I did not go for an extended study of the C standard to see what was right and what was wrong, but I do believe that whether the typecast is required or not should be the same in both cases.
Erik
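The zero result is consistent with the U8*U8 product being kept in an 8-bit intermediate. On a desktop compiler the effect can be mimicked by forcing the truncation explicitly (the operand values here are hypothetical, since the posted code does not show them; only the width of the intermediate matters):

```c
#include <stdint.h>

/* Product forced through an 8-bit intermediate:
   e.g. 80 * 32 = 2560 (0x0A00) wraps to 0x00.              */
static uint16_t load_8bit(uint8_t wdt, uint8_t hgt, uint8_t lin)
{
    uint8_t product = (uint8_t)(wdt * hgt);  /* truncated to 8 bits */
    return (uint16_t)(product / lin / 2);
}

/* One operand widened to 16 bits before the multiply, which is
   presumably what the (U16) cast buys in the concatenated form.  */
static uint16_t load_16bit(uint8_t wdt, uint8_t hgt, uint8_t lin)
{
    return (uint16_t)((uint16_t)wdt * hgt / lin / 2);
}
```

With wdt = 80, hgt = 32, lin = 16, the 8-bit version yields 0 while the 16-bit version yields the correct 80.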
You don't show your data declarations. Are you using signed or unsigned data?
Note that the compiler normally performs operations on integers. If the compiler promotes your signed characters to 16-bit integers, they sign-extend: a char holding 0x80 becomes 0xff80. The 0x7f constant (0x007f) strips the extension bits and gives the expected result, but 0xff cast to char is -1, which promotes back to 0xffff and masks off nothing.
Hence, when sign extension is involved, the right-shift path (masked with 0x7f) will work, but the left-shift path will fail.
Try to write (unsigned char)0xff or 0xffu.
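The sign-extension explanation above can be checked on any host compiler where plain char is signed (a host-side sketch; Keil C51's actual code generation may differ, and the helper name is made up):

```c
/* ch & mask, returned as the promoted int so the effect of the
   usual arithmetic conversions is easy to inspect.              */
static int mask_char(signed char ch, int mask)
{
    /* ch is promoted to int before the AND and, on a common
       two's-complement machine, sign-extends: a char holding
       0x80 becomes 0xFF80 with C51's 16-bit int.                */
    return ch & mask;
}
/*
   mask_char((signed char)0x80, 0x7F)              -> 0x00
       (the extension bits are stripped together with bit 7)
   mask_char((signed char)0x80, 0xFF)              -> 0x80
       (plain 0xFF is the int constant 0x00FF; the low byte survives)
   mask_char((signed char)0x80, (signed char)0xFF) -> -128
       ((char)0xFF is -1, which promotes back to all ones and
        masks off nothing at all)
*/
```

So with signed chars, a 0x7F mask on the right-shift path and a char-typed 0xFF mask on the left-shift path behave quite differently after promotion.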
I try to avoid signed :) In the above, all are unsigned.
Anyhow, whether signed or unsigned, the need (or not) for a typecast should be the same in both cases.
PS: what 'constants'?
Ages ago, someone commented something like 'momma help' next to some code that was temporary / still to be implemented, and somehow it stuck. That's why you see // momma on the temporary code above.