I think this is a basic question.
I have an ARM7TDMI (LPC2138) controller running at a 12 MHz clock frequency.
Suppose my program performs 1,000,000 comparisons inside one function, like below:
/***************/
FunctionX
{
    for (...)          /* 1000 times */
        for (...)      /* 1000 times */
        {
            if (Value > MeanValue)
            /* Value and MeanValue are both unsigned char */
        }
}
/*************/
Can anyone guide me on how to derive how much time my function will take to perform the 1000 * 1000 = 1,000,000 comparisons?
Look up "Performance Monitor" in the uVision manual.
And at some point you'll have to learn the difference between "time complexity" and "actual runtime".