Greetings, how long will this delay take?
int delay(int start, int end)
{
    int counting;
    counting = start;
    while (counting < end)
        ++counting;
    return 7;
}
What is this, broken code day? Why not count down to 0 instead? What about the effects of compiler optimization? This doesn't even compile! And why does returning 7 make any sense? Lucky number?
See: www.8052.com/.../162556
delays in 'C' are by definition undefined.
Erik
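To make the count-down and optimization points above concrete, here is a minimal sketch (mine, for illustration only, not the original poster's code) of the same busy-wait written to count down to 0, with the counter qualified volatile so an optimizing compiler cannot simply delete the loop:

/* hypothetical example: a down-counting busy-wait                   */
/* volatile forces the compiler to actually perform each decrement,  */
/* but the time per iteration is still whatever the compiler, the    */
/* optimization level and the clock make of it                       */
void delay_down(volatile unsigned int count)
{
    while (count != 0U)
        --count;
}

Even then, as Erik says, the language itself guarantees nothing about how long this takes; that depends entirely on the compiler, its options, and the clock driving the part.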
Think about what time of year it is... school has just started and the first problem sets will be due soon.
Hey... nice code, chum...
I've run the code, and here are the results we see:
delay ( 1 , 1000 ) is 97S
delay ( 1 , 10000 ) is 164S
delay ( 1000 , 2000 ) is 103S
delay ( 1 , 60000 ) is 4S
delay ( 0 , 100000 ) is 22S
"ive run the code and here is the result we see
delay ( 1 , 1000 ) is 97S"
You haven't said what compiler options you used;
You haven't said what processor you used;
You haven't said what clock frequency you used;
97S = 97 siemens, which is a measure of conductance:
en.wikipedia.org/.../Siemens_(unit)
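Since none of those details were given, here is a hypothetical way such numbers could be produced reproducibly on an ordinary hosted PC compiler (my illustration, not what the poster actually ran); on an 8051-class target you would measure against a hardware timer instead:

#include <stdio.h>
#include <time.h>

/* the function under discussion, unchanged except that the counter is  */
/* volatile here so an optimizing compiler does not remove the loop     */
int delay(int start, int end)
{
    volatile int counting = start;
    while (counting < end)
        ++counting;
    return 7;
}

int main(void)
{
    clock_t before = clock();
    delay(1, 1000);
    clock_t after = clock();

    printf("delay(1, 1000) took about %f seconds of CPU time\n",
           (double)(after - before) / CLOCKS_PER_SEC);
    return 0;
}

Whatever number this prints still depends on the compiler, the optimization level, the processor, and the clock, which is exactly why those three unstated details matter.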
"delay ( 1 , 1000 ) is 97S"
Assuming that 97S is meant to mean 97 seconds, can we conclude that the processor is an abacus and that the clock frequency (finger frequency?) is quite high?
Oh, yeah.
"here are the results we see"
So, you're not alone in your clueless quest?