I think this is a fairly basic question.
I have an ARM7TDMI (LPC2138) controller operating at a 12 MHz clock frequency.
Suppose my program performs 10 lakh (1,000,000) comparisons in one function, like below:
/***************/
void FunctionX(void)
{
    for (...)                       /* 1000 times */
    {
        for (...)                   /* 1000 times */
        {
            if (Value > MeanValue)
                ...
        }
    }
    /* where Value and MeanValue are both unsigned char */
}
/*************/
Can anyone guide me on how to derive how much time my function will take to perform 1000 * 1000 = 10 lakh comparisons?
If the goal is to measure how much time a function takes to execute, I have followed this method: obtain its corresponding assembly code, then add up the machine cycles of each instruction, multiplied by the number of times it is executed.
"measure cycles"
Don't you think that is a bit complex and inaccurate?
To give my microcontroller's particulars: Philips states that the LPC2130 can operate at a maximum of 60 MHz, and its MIPS (millions of instructions per second) rating is also 60.
So can I assume that each ARM assembly instruction, of any type, is executed in one cycle?
Right now I need to consider 12 MHz as the operating clock frequency.
Thanks in advance, Raj
Look up "Performance Monitor" in the uVision manual.
And at some point you'll have to learn the difference between "time complexity" and "actual runtime".