My code is 110% correct; e.g. no errors and it runs properly. See:
int _val;
int myfunc2(int val) { _val = val; return _val; }
int Bar(int val) { return _val + val + 1; }
void myfunc1(int val) { _val += Bar(val); }
etc etc etc
It doesn't give me the right answer sometimes.
HEEEEEEEELLLLLLLPPPPPPPPPPPPPPPPPPPPPPPPPPPPPPPPP
"my code is 110% correct"
Not really - for a start, user code should not define identifiers with leading underscores: at file scope, such names are reserved for the implementation!
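A minimal sketch of the fix, assuming a plain rename is acceptable (the name g_val is just an illustration, not anything from the original post):

int g_val;                  /* was: int _val; file-scope names with a
                               leading underscore are reserved in C */

int myfunc2(int val)
{
    g_val = val;
    return g_val;
}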
"no errors"
It is entirely possible - and all too common - to write some very bad code that compiles with zero errors.
See http://www.ioccc.org/ for some prime examples!
"runs properly ... doesnt give me the right answer sometime"
So it's not running properly, then - is it?!
"HEEEEEEEELLLLLLLPPPPPPPPPPPPPPPPPPPPPPPPPPPPPPPPP"
How?
You've just shown a few function definitions; you haven't shown how you combine all this into a real program.
You haven't explained what you think should be the "right" answers.
You haven't explained what answers you get - and why you think they are "wrong".
It doesn't give me the right answer sometimes
The "sometimes" part usually points to resource-sharing conflicts in a system with multitasking or interrupts. But you'll have to provide some information if you want to get meaningful help.
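As a minimal sketch of that kind of conflict (the names and the 8-bit target are assumptions, not taken from the original post):

#include <stdint.h>

volatile uint16_t tick_count;   /* incremented by a timer interrupt */

void timer_isr(void)            /* hypothetical interrupt handler */
{
    tick_count++;
}

uint16_t read_ticks(void)
{
    /* On an 8-bit CPU this 16-bit read takes two instructions. If the
       interrupt fires between them, the two halves belong to different
       counts, so the value is wrong only "sometimes". Disabling the
       interrupt around the read is one common fix. */
    return tick_count;
}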
Watch out for functions that modify global variables and are called as part of expressions that contain those same globals. You have to look out for evaluation-order problems.
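For instance, here is a sketch built from the posted functions (the values 10 and 20 are just illustrations):

int _val;
int myfunc2(int val) { _val = val; return _val; }

int demo(void)
{
    _val = 10;
    /* C leaves the evaluation order of the two operands of '+'
       unspecified. If _val is read first, the result is 30; if
       myfunc2(20) runs first, _val has already become 20 and the
       result is 40. Either answer is legal, so it can change with
       the compiler or the optimization level. */
    return _val + myfunc2(20);
}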
And depending on your code, you may have to consider aliasing problems too: multiple ways of accessing the same variable can make the compiler fail to notice that one access path changes the variable, so it assumes the variable is unchanged when seen through the other access path.
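One classic instance (a hedged sketch, not taken from the poster's code) is breaking the strict-aliasing rule:

int g;

int f(float *p)
{
    g = 1;
    *p = 2.0f;   /* a float* is assumed never to alias an int... */
    return g;    /* ...so the compiler may return the cached value 1 */
}

/* Calling f((float *)&g) overwrites the memory holding g, yet an
   optimizing compiler can still legally return 1 here. */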
Think twice about your use of global variables.
Some people would say that the code is bad simply because it uses global variables!
I don't agree that globals are inherently bad but, as Per says, they certainly do bring issues that you have to consider carefully...
"My code is 110% correct" should read "my code compiles with no errors, but is not working"
Erik
My code is 110% correct; e.g. no errors and it runs properly.
You're deluding yourself. Even setting aside the pure lunacy of claiming an above-unity percentage of correctness for anything, that code is clearly not 100% correct.
No. That code cannot give you any answer qualifiable as "right" or "wrong" at all, for the simple reason that it doesn't do anything, by itself.
The bandwidth wasted on that nonsense could have been put to much better use if you had actually stated a question, instead of just dumping some unrelated shards of code and expecting people to read your mind regarding what you think is actually wrong with it.
Hmmm ...
http://www.keil.com/forum/16227/
Which code is better? Please vote:
[ ] Totally bug-free code
[ ] 100% bug-free code
[ ] 110% bug-free code
It is so HARD to write even moderately bug-free code. Circumstances might change. Components could be replaced, rendering once-functioning code useless. Slight differences between processors might have a huge impact on large program constructs. EMC is always there. System load will determine "correctness", too. The tool chain (and its settings) might play a role. Systems that "live on the edge" in terms of timing might fail without warning (I made such a mistake 1.5 years ago only to fix it 3 weeks ago!).
110%?
I believe 70% is optimistic!
I just set the compiler switch --quality=100 and take it from there.
Then I know that if the code goes through the compiler without any reported warnings/errors, everything is dandy. It's the same as with jpeg images - if you drop the quality parameter, you get lossy results.
(I made such a mistake 1.5 years ago only to fix it 3 weeks ago!).
That's nothing. I've fixed a bug like that in a subroutine which had successfully gone through more than 10 years of continuous re-use all across the company. Then mine happened to be the project where all the external conditions (speed of CPU, speed of peripheral device, etc.) were just right to trip the lurking bug. The central code sequence had been exactly the wrong way round all that time.
Yes, that sounds pretty nasty. Mine lived on for a long time undetected. Then, starting at a particular release, it was relatively easy to reproduce (the reason remains a big mystery, still). The timing margins for addressing an external analog output via the SPI bus were not conservative enough, so that some (one?) of the 12 bits heading towards one of the DAC's banks ended up nowhere, or at an identical, separate DAC (there is a chip select there). Because the DAC has a shift register to input the data, even the loss of 1 bit meant huge swings! That was fun to watch :-) but much more fun to see with a scope and solve...
The false (but very widely-held) belief that, because the stuff "works" (sic), it must, therefore, be right...
It has happened to all of us. Here is my 'war story': http://www.keil.com/forum/docs/thread15893.asp Bradford
And Per's even better story. Bradford
"Then mine happened to be the project where all the external conditions (speed of CPU, speed of peripheral device, etc.) were just right to trip the lurking bug."
I am not convinced that it is a bug.
I think it is practically impossible to write completely bug-free code that works under all circumstances.
Instead, we all write pieces of code that have "limitations" which unravel under certain conditions. It is our job to document such limitations, so that when we find ourselves in a circumstance where the limitations become reality, we know that we need to fix the code for that particular application.
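A hedged illustration of what documenting such a limitation might look like (the function and the figures are made up for the example):

/*
 * delay_us() - crude busy-wait delay.
 *
 * LIMITATION: calibrated for a 24 MHz core clock with zero flash
 * wait states and interrupts disabled. On any other clock, or with
 * interrupts enabled, the delay will be longer than requested.
 * Re-measure before reusing this in another project.
 */
void delay_us(unsigned int us)
{
    while (us--) {
        volatile int i;      /* volatile keeps the loop from being
                                optimized away */
        for (i = 0; i < 4; i++)
            ;
    }
}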