I've got an unexpected problem with interrupts. Would be nice to discuss it.
I use Timer0 to generate about 16000 fast interrupts (FIQ) per second. There is an AD conversion and a few lines of long long int arithmetic to compute. Inside this FIQ routine there is a static counter. When it reaches 16000 it is reset and a software interrupt (SWI) is generated, so that happens once per second.
The SWI routine has to do some computations and then display it all on a special three-wire connected display.
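Roughly, my FIQ routine is structured like this (simplified sketch, not my real code; all the helper names are placeholders):

extern void adc_start_conversion(void);        /* placeholder */
extern int  adc_conversion_done(void);         /* placeholder */
extern unsigned int adc_read_result(void);     /* placeholder */
extern void trigger_software_interrupt(void);  /* placeholder */
extern void ack_timer0_interrupt(void);        /* placeholder */

static volatile long long accumulator;

void fiq_handler(void)
{
    static unsigned int count = 0;
    unsigned int sample;

    adc_start_conversion();
    while (!adc_conversion_done())
        ;                                      /* wait for the AD conversion */
    sample = adc_read_result();

    accumulator += (long long)sample * sample; /* example of the long long arithmetic */

    if (++count >= 16000) {                    /* about one second at 16 kHz */
        count = 0;
        trigger_software_interrupt();          /* once-per-second SWI */
    }

    ack_timer0_interrupt();                    /* clear the Timer0 interrupt */
}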
The program does not work with 16000 Timer0 interrupts per second. It is terminated by Abort mode. This comes a little unexpected to me. The effect is gone if I turn the FIQ frequency down to 100 Hz. Is it possible that interrupts create a stack overflow?
Should I post my code?
My recommendation is to have the FIQ copy the relevant data once per second and then have the main loop - or a dedicated task - perform the display output. A three-wire interface to the display means that it will take a while to output the data. Leave that to the main loop.
Have you measured how long it takes to emit the data to the display?
Have you measured what percentage of your CPU power is consumed just by the FIQ when running at 16 kHz? All time lost to the FIQ is extra calendar time needed for the display output, unless you can issue a single transfer to the display and then let the hardware handle all output.
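If not, one quick way to get a feel for it is to bracket the FIQ body with a spare output pin and look at the duty cycle on an oscilloscope. The pin functions here are just placeholders for whatever port register writes your target uses:

extern void debug_pin_set(void);     /* placeholder for the port register write */
extern void debug_pin_clear(void);   /* placeholder for the port register write */

void fiq_handler(void)
{
    debug_pin_set();
    /* ... existing ADC handling and arithmetic ... */
    debug_pin_clear();
}

The fraction of time the pin is high is roughly the fraction of CPU the FIQ eats.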
You mention SWI. By that, I assumed that you meant the specific instruction used to enter the SVC mode. This is normally done by declaring functions like:
void __swi(0) my_handler(params);
void __SWI_0(params) { ... }
But I see now that you instead mean that you manually assert a normal interrupt, and let the processor enter this interrupt as soon as it leaves the FIQ handler.
Then no SVC mode will be involved. However, have you made sure that the IRQ stack is large enough to support the printf() call?
Another thing: You start an ADC conversion in the FIQ handler, and then spend time waiting for the answer. Don't do that. Get the existing ADC value and then start a new transfer. Then the interval between the FIQ interrupts will be used for the conversion. 16000 times the dummy wait for a conversion is just lost CPU power - potentially more CPU power than you have...
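A rough sketch of that ordering, with placeholder names for the ADC accesses:

extern unsigned int adc_read_result(void);   /* placeholder: fetch last result */
extern void adc_start_conversion(void);      /* placeholder: kick off the next one */

void fiq_handler(void)
{
    unsigned int sample;

    sample = adc_read_result();   /* result of the conversion started on the previous FIQ
                                     (the very first sample is garbage and can be discarded) */
    adc_start_conversion();       /* runs in hardware until the next FIQ */

    /* ... long long arithmetic on 'sample', counter handling, etc. ... */
}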
Tamir: The transfer should not need to be interrupt-safe. The FIQ only performs the update once per second, and then issues a request to activate the reader. As long as the reader - the display output code - is guaranteed to finish within one second of being triggered, there should be no need for extra protection.
Per, you are right. I built my own scenario based on the text description, rather than looking carefully at the code.
Thank you all for your help. :-) At this time my modified code works. I cancelled this SWI stuff. I just put the 3-wire display output in the main loop. The FIQ routine sets a flag after 16384 ADC samplings. The main routine watches this flag, does a 3-wire display output and resets the flag.
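In outline it now looks like this (simplified; the display function name is just a placeholder):

extern void three_wire_display_output(long long value);  /* placeholder for the slow 3-wire transfer */

static volatile int display_update_due;    /* set by FIQ, cleared by main */
static volatile long long result;          /* published by the FIQ once per second */

void fiq_handler(void)
{
    static unsigned int count = 0;
    static long long accumulator;

    /* ... ADC read and long long arithmetic into 'accumulator' ... */

    if (++count >= 16384) {            /* roughly one second of samples */
        count = 0;
        result = accumulator;          /* publish once per second only */
        accumulator = 0;
        display_update_due = 1;        /* no display work in the FIQ */
    }
}

int main(void)
{
    /* ... hardware, ADC and Timer0 setup ... */
    for (;;) {
        if (display_update_due) {
            three_wire_display_output(result);  /* slow output, done at main level */
            display_update_due = 0;
        }
    }
}

As Per pointed out, since the FIQ only publishes the result once per second and the display output finishes well within that second, no extra locking is needed around the copy.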