Hi guys, I have a problem. I've written a program in C51, but I don't know how to calculate the time taken to run the whole program. Is there any way to measure it? Thank you!
There are several ways to do this:
- Run it in the simulator and use the performance analyzer.
- Start a hardware timer at the beginning of the program and read it at the end. If the program takes longer than the timer can represent, write an interrupt handler that counts timer overflows, and add the fraction of a tick that's left over at the end.
- If the program takes human-scale times and precision isn't an issue, just use a stopwatch.
- Toggle an I/O pin at the start and end, and use a logic analyzer (or scope) to measure the time between the edges.
- Send a character out the UART at the start and end, and time the interval on your PC. Don't forget that each character takes about a millisecond to transmit at 9600 bps, and check the accuracy of the timers in your PC's OS, too.
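To illustrate the timer approach, here is a minimal sketch, assuming a classic 8051 with a 12 MHz crystal and the Keil C51 toolchain (reg51.h, the "interrupt" keyword). The names overflow_count and work_under_test are just placeholders for your own code, not anything from the posts above.

/* Time a stretch of code with Timer 0, extending it with an overflow counter. */
#include <reg51.h>

static volatile unsigned long overflow_count;   /* Timer 0 overflows (65536 machine cycles each) */

/* Timer 0 overflow ISR: extends the 16-bit timer into a wider counter */
void timer0_isr(void) interrupt 1
{
    overflow_count++;
}

void work_under_test(void)
{
    /* the code you want to time goes here */
}

void main(void)
{
    unsigned long cycles;

    TMOD = (TMOD & 0xF0) | 0x01;    /* Timer 0, mode 1 (16-bit) */
    TH0 = 0;
    TL0 = 0;
    overflow_count = 0;
    ET0 = 1;                        /* enable Timer 0 overflow interrupt */
    EA  = 1;

    TR0 = 1;                        /* start timing */
    work_under_test();
    TR0 = 0;                        /* stop timing */

    /* Total machine cycles = overflows * 65536 + remaining count.
     * On a standard 12 MHz 8051, one machine cycle is 1 us. */
    cycles = overflow_count * 65536UL + (((unsigned int)TH0 << 8) | TL0);

    while (1)
        ;                           /* report "cycles" however you like, e.g. over the UART */
}

On a standard (12-clocks-per-cycle) part the cycle count converts directly to microseconds; on a faster derivative, scale by your actual machine-cycle time.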
Unless you've somehow managed to write the entire program without any conditional statements, the time it takes to "run" will be different each time, depending on which routes it takes through the conditions. Therefore you will need to work out a strategy for obtaining a representative set of run timings and take some sort of average. E.g., see: http://www.keil.com/benchmks/tm_c51_v7_small.asp BTW: this question is usually meaningless in embedded systems: the software starts when the equipment is switched on, and keeps running until it is switched off!