My code is 110% correct, e.g. no errors and it runs properly. See:
int _val;
int myfunc2(int val) { _val = val; return _val; }
int Bar(int val) { return _val + val + 1; }
void myfunc1(int val) { _val += Bar(val); }
/* etc etc etc */
It doesn't give me the right answer sometimes.
HELP!
My code is 110% correct, e.g. no errors and it runs properly.
You're deluding yourself. Even setting aside the pure lunacy of claiming an above-unity percentage of correctness for anything, that code is clearly not 100% correct.
No. That code cannot give you any answer qualifiable as "right" or "wrong" at all, for the simple reason that it doesn't do anything, by itself.
The bandwidth wasted on that nonsense could have been put to much better use if you had actually stated a question, instead of just dumping some unrelated shards of code and expecting people to read your minds regarding what you think is actually wrong with it.
Hmmm ...
http://www.keil.com/forum/16227/
which code is better? Please vote: [ ] Totally bug free code [ ] 100% bug free code [ ] 110% bug free code
It is so HARD to write even moderately bug-free code:
- Circumstances might change.
- Components could be replaced, rendering once-functioning code useless.
- Slight differences between processors might have a huge impact on large program constructs.
- EMC is always there.
- System load will determine "correctness", too.
- Tool chain (settings) might play a role.
- Systems that "live on the edge" in terms of timing might fail without warning (I made such a mistake 1.5 years ago, only to fix it 3 weeks ago!).
110%?
I believe 70% is optimistic!
I just set the compiler switch --quality=100 and take it from there.
Then I know that if the code goes through the compiler without any reported warnings/errors, everything is dandy. It's the same as with JPEG images - if you drop the quality parameter, you get lossy results.
(I made such a mistake 1.5 years ago only to fix it 3 weeks ago!).
That's nothing. I've fixed a bug like that in a subroutine which had successfully gone through more than 10 years of continuous re-use all across the company. Then mine happened to be the project where all the external conditions (speed of CPU, speed of peripheral device, etc.) were just right to trip the lurking bug. The central code sequence had been exactly the wrong way round all that time.
Yes, that sounds pretty nasty. Mine lived on for a long time undetected. Then, starting at a particular release, it was relatively easy to reproduce (the reason remains a big mystery, still). The timing margins for addressing an external analog output via the SPI bus were not conservative enough, so that some (one?) of the 12 bits heading towards one of the DAC's banks ended up nowhere, or at an identical, separate DAC (there is a chip select there). Because the DACs have a shift register to input the data, even the loss of 1 bit meant huge swings! That was fun to watch :-) but much more fun to see with a scope and solve...
The false (but very widely-held) belief that, because the stuff "works" (sic), it must, therefore, be right...
It has happened to all of us. Here is my 'war story': http://www.keil.com/forum/docs/thread15893.asp (Nov 8, 2009) Bradford
And Per's even better story. Bradford