I am trying to implement a high-precision timer on an AT89S8252 using the following code:

void configureTimer0(void)
{
    EA = 0;
    TMOD &= 0xF0;          /* clear Timer 0 mode bits */
    TMOD |= 0x01;          /* Timer 0, mode 1 (16-bit) */
    ET0 = 1;
    TH0 = TH_VAL;
    TL0 = TL_VAL;
    TR0 = 1;
    EA = 1;
}

/* Timer 0 interrupt handler */
void timer0(void) interrupt 1
{
    TH0 = TH_VAL;          /* software reload on every overflow */
    TL0 = TL_VAL;
    P2_1 = !P2_1;          /* toggle the pin on P2.1 */
    switch (val)
    {
        case 1:
            // code
            break;
        case 2:
            // code
            break;
    }
}

void main(void)
{
    configureTimer0();
    while (1);
}

The problem occurs when I insert any code inside the Timer 0 interrupt handler, such as a blinking LED or a simple switch statement. The code inside the interrupt handler was kept as simple as possible. In order to keep a constant rate, I have to use a digital oscilloscope to adjust TH_VAL and TL_VAL. Is there another way to implement this? Do I have to recalibrate TH_VAL and TL_VAL after inserting any piece of code inside the timer interrupt handler?

Best regards,
Andre
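For reference: on a standard 12-clock 8051 core such as the AT89S8252, Timer 0 in mode 1 counts once per machine cycle, so nominal reload values can be computed rather than measured. A sketch of the arithmetic, assuming purely as an example a 12 MHz crystal and a 1 ms tick; the intermediate names below are illustrative:

/* 12 MHz crystal / 12 clocks per machine cycle = 1 MHz count rate,
   so a 1 ms tick needs 1000 counts before the 16-bit overflow. */
#define TICK_CYCLES  1000U
#define RELOAD       (65536U - TICK_CYCLES)              /* = 0xFC18 */
#define TH_VAL       ((unsigned char)(RELOAD >> 8))      /* 0xFC */
#define TL_VAL       ((unsigned char)(RELOAD & 0xFF))    /* 0x18 */

The shift away from this nominal value comes from the interrupt latency and the instructions executed before the reload; adding code to the ISR can also change the compiler's register-saving prologue, which runs before the first statement, so the period shifts again. That is what the oscilloscope adjustment ends up compensating for.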
"1) The interrupts (according to the manual) always execute the fitst instruction." Quoting from the manual: "Note that if an interrupt of higher priority level goes active prior to S5P2 of the machine cycle labeled C3 in Figure 20 [first machine cycle of LCALL to the ISR], then in accordance with the above rules it will be vectored to during C5 and C6 [the two machine cycles following the first LCALL], without any instruction of the lower priority routine having been executed." "2) access to IE and RETI acn not be interrupted." Correct, but only if an access to IE/IP or a reti is the instruction actually in progress. A higher priority interrupt occuring during the previous instruction will still be vectored to at the end of that instruction. I only know of two ways to maintain an accurate clock using timer interrupts: 1) Use hardware autoreload. 2) Choose a software reload value that doesn't require the timer to be stopped or the low byte to be reloaded. Stefan