I have a function in which I declare two variables as follows:

unsigned char voltage = 0;
unsigned char test = 0;

When I run the code in the debugger, these variables appear to be initialized to random numbers rather than 0. What is the problem?
Beware of the optimiser. Does the problem go away when you turn down the level of optimisation? The optimiser can do things that the debugger is not entirely aware of, so the debugger can appear to give strange results. Of course, your algorithm should still work correctly; optimisation should not change that.
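As an illustration (a minimal sketch, not the original poster's code): with optimisation enabled, the compiler may keep the variables in registers, or remove them entirely if they are never used, so the memory locations the debugger watches still hold leftover bytes that look "random". Declaring the variables volatile while debugging forces real stores to memory, which usually makes the debugger display the expected values:

#include <stdio.h>

int main(void)
{
    /* With optimisation on, these may live only in registers (or be
       removed if unused), so the stack slots the debugger watches can
       still contain leftover "random" bytes. */
    unsigned char voltage = 0;
    unsigned char test = 0;

    /* 'volatile' forces every read and write to go through memory,
       so the debugger shows the values you expect. Debugging aid only;
       remove it once the issue is understood. */
    volatile unsigned char voltage_dbg = 0;
    volatile unsigned char test_dbg = 0;

    printf("%u %u %u %u\n",
           (unsigned)voltage, (unsigned)test,
           (unsigned)voltage_dbg, (unsigned)test_dbg);
    return 0;
}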
"Of course, your algorithm should still work correctly, optimisation should not change that" Unless your algorithm is based upon a false assumption, which the optimiser invalidates?