Hi, please help me understand what is wrong here. When I wrote:
void main() {
    unsigned char t;
    t = 0x20;
    t |= 0x30;
    t = 0x21;     // <- in debugger t=0xF5 !!!
    initProc();
    IOSET = 0x0C;
}
void main() {
    unsigned char t;
    t |= 0x20;
    t = 0x30;
    t = 0x21;     // <- now t=0x21 !!!
    initProc();
    IOSET = 0x0C;
}
void main() {
    unsigned char t = 0x20;
    t |= 0x30;
    t = 0x21;     // <- now t=0xF5 again! :( WHY???
    initProc();
    IOSET = 0x0C;
}
Could this be an optimisation issue? You never actually read the value of 't', so every assignment to it is a dead store and the compiler is quite at liberty to optimise the variable away completely; the debugger then shows you whatever stale value happens to occupy that location (0xF5 in your case). Here's another thread where the debugger seems to be confused by the compiler's optimisation of an unused variable: http://www.keil.com/forum/docs/thread7144.asp
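One quick way to test this theory is to force the compiler to keep every write, e.g. by declaring 't' as volatile. A minimal sketch, assuming your initProc() and IOSET lines stay unchanged:

void main() {
    volatile unsigned char t;   // volatile: each store must actually be emitted
    t = 0x20;                   // t == 0x20
    t |= 0x30;                  // t == 0x30 (0x20 | 0x30)
    t = 0x21;                   // debugger should now show t == 0x21
    initProc();
    IOSET = 0x0C;
}

Alternatively, actually using the value afterwards (for example, writing 't' to a port register) should have the same effect of keeping the stores alive.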
Thanks, now I see.