I have a function in which I declare two variables as follows: unsigned char voltage = 0; unsigned char test = 0; When I step through the code in the debugger, these variables appear to hold random values instead of 0. What is the problem?
"Of course, your algorithm should still work correctly, optimisation should not change that" Unless your algorithm is based upon a false assumption, which the optimiser invalidates?