I have a function in which I declare two variables as follows: unsigned char voltage = 0; unsigned char test = 0; When I run the code in the debugger, these variables contain random values rather than the 0 they were initialized to. What is the problem?
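For reference, here is a cut-down sketch of the situation - only the two declarations are my real code; the function name and everything else are just placeholders:

    void take_reading(void)             /* placeholder name */
    {
        unsigned char voltage = 0;
        unsigned char test = 0;

        /* in the debugger, voltage and test show random values here
           instead of the 0 they were initialized to */
    }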
Just a quick thought - where are you initialising them? If it is just inside the function you've written, then they only exist while that function is actually executing, and not while any other part of the code is running. To check this out, you could try declaring the variables globally and see if it makes a difference. What do you reckon? Yours, Richard.
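P.S. A rough sketch of the difference I mean - the names here are just made up for illustration:

    unsigned char saved_reading = 0;    /* global: exists for the whole program run */

    void some_function(void)
    {
        unsigned char voltage = 0;      /* locals: only exist while some_function() is running */
        unsigned char test = 0;

        saved_reading = voltage + test; /* both locals vanish again when the function returns */
    }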
I am only using them in the function and nowhere else. I attempted to declare them globally and then use them, but I had the same problem. Thanks.
Okay - my next guess would be to suggest that they are possibly being overwritten by something else. For example, a mistake I made recently was to declare an array of size 50, but then write to elements 1 to 50 rather than 0 to 49 (unfortunately, in tools such as Matlab, where indexing starts at 1, that would have been correct). The way I spotted it was that the 'random' numbers I was seeing in my overwritten variable were in about the same range as the values one of my other variables should have held (apparently this is one of the benefits of Java - I'm told it has bounds checking to stop you doing this). Do you think something like this could be causing it? Also, can you verify in some other way that what the debugger is telling you is correct? Yours, Richard.
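P.S. In cut-down form, the mistake looked something like this - the names and sizes here are just for illustration:

    unsigned char samples[50];          /* valid indices are 0 to 49 */
    unsigned char voltage = 0;          /* could end up placed right after the array in memory */

    void fill_samples(void)
    {
        unsigned char i;

        for (i = 1; i <= 50; i++)       /* bug: writes samples[50], one element past the end */
        {
            samples[i] = 0xFF;
        }
        /* The write to samples[50] lands on whatever the linker placed next -
           possibly 'voltage' - which then shows a 'random' looking value in
           the debugger. The loop should, of course, run from 0 to 49. */
    }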
What memory model do you use? If you use LARGE, do you actually have any XDATA memory? Jon
"Okay - my next guess would be to suggest that they are possibly being over written by something else." I that's the case, you should be able to see it happening in the debugger: use the disassembler view, and you'll be able to see where your local variables are initialised. Step through the code at the assembler level, and see where it goes wrong. This should give you some more clues. Other possible causes could be: Errant pointers; Overlay problems (especially if you're using function pointers, or calling functions recursively); Stack problems.
Beware of the optimiser. Does the problem go away when you turn down the level of optimisation? The optimiser can do things that the debugger is not entirely aware of, which can make the values it shows look strange. Of course, your algorithm should still work correctly, optimisation should not change that.
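As an illustration of how that can look (the function names here are invented): if an initial value is never actually used, the optimiser is entitled to throw the initialisation away, and the debugger may then show you whatever stale value happens to sit in that memory location:

    unsigned char get_reading(void);    /* made-up stand-in for the real data source */

    void measure(void)
    {
        unsigned char voltage = 0;      /* the 0 is never read... */

        voltage = get_reading();        /* ...it is overwritten straight away, so the
                                           optimiser may remove the '= 0' entirely, or keep
                                           voltage in a register; the memory location the
                                           debugger is watching then never changes. */
    }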
"Of course, your algorithm should still work correctly, optimisation should not change that" Unless your algorithm is based upon a false assumption, which the optimiser invalidates?