I am running a 10 kHz interrupt that executes some mathematical DSP code. It has to complete within 100 microseconds, and preferably in less than 40 microseconds, to leave me spare processing power. I ran the code on the software simulator and, according to the Performance Analyzer window, the interrupt routine was taking 33 microseconds and using about a third of the overall CPU power. Great, I thought! However, when I run the code on the real target I find it takes about 3 times as long, and there is barely any spare processing power left for my other routines. Any ideas?

Some technical information:

- I am using PK166 V5.03.
- Both the simulator and the target are running at 40 MHz.
- The code is a deterministic block of maths, i.e. there are no timeouts or waiting on I/O.
- The interrupt routine has the highest priority, and I checked while single-stepping the target that no other interrupts are active.
- I tested the speed of the target code extensively using an oscilloscope and a spare I/O pin (see the sketch after this list). No single piece of the code stands out as the culprit; the whole routine just runs slower than the simulator by about 2 to 3 times.
- I know the target really is running at 40 MHz because all the PWM and CAN frequencies are as expected.
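
For what it's worth, the pin-toggle measurement I am doing looks roughly like the sketch below. TEST_PIN_HIGH() and TEST_PIN_LOW() are just placeholders for the spare I/O pin I am driving, not real Keil/PK166 names, and the interrupt vector setup is omitted; the point is only that the pulse width seen on the scope is the real execution time of the maths block.

```c
/* Cycle budget at 40 MHz (for context):
 *   10 kHz interrupt  -> 100 us period -> 4000 CPU cycles available
 *   33 us (simulator) -> roughly 1320 cycles
 *   ~3x slower target -> roughly 100 us, i.e. the whole budget is used up
 */

/* Placeholder macros: on the real part these would set/clear a bit in a
 * spare output port SFR. They are assumptions for illustration only.   */
#define TEST_PIN_HIGH()   /* e.g. set the spare port pin   */
#define TEST_PIN_LOW()    /* e.g. clear the spare port pin */

static void dsp_step(void);   /* the deterministic block of maths */

/* Body of the 10 kHz interrupt routine (vector/priority set up elsewhere). */
void dsp_isr(void)
{
    TEST_PIN_HIGH();          /* rising edge marks entry to the maths      */
    dsp_step();               /* the code whose duration is being measured */
    TEST_PIN_LOW();           /* falling edge marks exit; the pulse width  */
                              /* on the oscilloscope is the execution time */
}

static void dsp_step(void)
{
    /* ... the DSP maths goes here ... */
}
```

Moving the TEST_PIN_HIGH()/TEST_PIN_LOW() pair inward around individual sections is how I confirmed that no single section is disproportionately slow.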