I'm testing the uVision3 v8.16 IDE with ADI's original ADuC814 eval board. When driving the DACs, everything works as long as I'm NOT using printf.
Does printf reset the DAC settings or something like that?
I inserted a condition so that printf only gets used after some delay; the DAC works until printf is called for the first time, so I assume it is printf itself, and not global stdio, that causes the effect.
Please see the small example below. With the printf call removed, DAC0 correctly outputs a sawtooth wave, but not when the printf is included.
...
#include <stdio.h>
#include <REG52.H>

sfr PLLCON = 0xD7;   // PLL CONFIGURATION BYTE
sfr DACCON = 0xFD;   // DAC CONTROL REGISTER
sfr DAC0L  = 0xF9;   // DAC0 DATA

unsigned char Output;

void main(void)
{
    PLLCON = 0x00;   // CPU 16.0 MHz clock
    DACCON = 0xFF;   // set both DACs: on, 8-bit

    while (1) {
        DAC0L = Output;
        Output++;
        printf("%02bX \n", Output);
    }
}
The program fragment I posted is part of a bigger program that does ADC and DAC interrupt-driven. In the main loop I do nothing but some printf output. I shrank the code just to demonstrate the problem.
I have NO problem with printf taking quite a long time to execute. My problem is that after the first printf call my DAC output is "dead". Before the first printf the DAC output works, even with the bit of code I posted. I also tried to set up DACCON again afterwards, but it doesn't help.
I don't know whether my serial output implementation is polled or interrupt-driven; I guess polled, since I did no setup for it and it worked fine "out of the box". The printf output appears correctly. How is output done when one just writes main(){printf("test");}?
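For reference, the default putchar in the Keil C51 runtime library is polled: it busy-waits on the UART's TI flag before writing the next byte to SBUF. A sketch of what that looks like (check the putchar source shipped with your toolchain for the exact version):

```c
#include <REG52.H>

/* Sketch of a polled putchar as used by default in C51.
 * It busy-waits on TI, so printf blocks the main loop until
 * each byte has fully left the UART. TI must have been set
 * once during serial setup or the first call hangs forever. */
char putchar(char c)
{
    while (!TI);        /* wait until previous byte is sent */
    TI = 0;             /* clear the transmit-complete flag  */
    return (SBUF = c);  /* load the next byte                */
}
```

So printf with no explicit setup only "works out of the box" if something has configured the serial port and set TI once; it does not touch the DAC registers itself.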
www.8052.com/.../161094
The problem was solved by shifting baud-rate generation from Timer 1 to Timer 2.
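For anyone finding this later, the Timer 2 baud-rate setup on an 8052-style core looks roughly like the sketch below. The 16.0 MHz clock and 9600 baud are assumptions taken from the earlier fragment; check the ADuC814 datasheet for the actual core frequency before computing the reload value.

```c
/* Sketch: Timer 2 as baud-rate generator (8052-style SFR layout).
 * Reload formula: RCAP2 = 65536 - Fcore / (32 * baud).
 * 16.0 MHz core and 9600 baud below are assumptions, not verified
 * against the ADuC814 -- consult the datasheet. */
sfr SCON   = 0x98;
sfr T2CON  = 0xC8;
sfr RCAP2L = 0xCA;
sfr RCAP2H = 0xCB;

void uart_init_t2(void)
{
    unsigned int reload =
        (unsigned int)(65536UL - 16000000UL / (32UL * 9600UL));

    SCON   = 0x52;               /* UART mode 1, REN, TI preset     */
    RCAP2H = (unsigned char)(reload >> 8);
    RCAP2L = (unsigned char)(reload & 0xFF);
    T2CON  = 0x34;               /* RCLK | TCLK | TR2: T2 clocks    */
                                 /* both RX and TX, timer running   */
}
```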
New problem, fitting the headline: when using a simple putchar in the main loop there is no problem, but when using printf instead, the mentioned I/O interrupt is slowed down significantly.
I have an interrupt firing several hundred times a second, reading and writing external I/O, and it works fine. The main program just supervises things and puts them out on the UART.
Does printf, or some part of it, disable interrupts at runtime? As far as I have read, printf also just uses putchar, so there should be no such problem, right?
I basically don't care how long printf takes, since the main work happens in the interrupt. So there is no execution-time problem for me in using printf, if only it wouldn't slow down my I/O interrupt.
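Since the default putchar busy-waits on TI, one common workaround is to make transmission non-blocking: putchar pushes into a ring buffer and the serial TX interrupt drains it into SBUF. The buffer logic itself is portable C and can be sketched on its own; the names tx_push/tx_pop are illustrative, not from any library:

```c
#include <assert.h>

/* Sketch: non-blocking transmit ring buffer. On the target, putchar
 * would call tx_push() and the serial TX interrupt would call tx_pop()
 * and write the byte to SBUF. Only the buffer logic is shown here. */
#define TX_SIZE 32   /* power of two, so wrap-around is a cheap AND */

static unsigned char tx_buf[TX_SIZE];
static unsigned char tx_head, tx_tail;

/* Returns 1 on success, 0 if the buffer is full (caller may retry). */
int tx_push(unsigned char c)
{
    unsigned char next = (unsigned char)((tx_head + 1) & (TX_SIZE - 1));
    if (next == tx_tail)
        return 0;                 /* full */
    tx_buf[tx_head] = c;
    tx_head = next;
    return 1;
}

/* Returns 1 and stores a byte in *c, or 0 if the buffer is empty. */
int tx_pop(unsigned char *c)
{
    if (tx_head == tx_tail)
        return 0;                 /* nothing to send */
    *c = tx_buf[tx_tail];
    tx_tail = (unsigned char)((tx_tail + 1) & (TX_SIZE - 1));
    return 1;
}
```

With this, printf returns as soon as its characters are buffered, and the main loop never stalls waiting on the UART.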
My tests show that time spent inside printf definitely slows down my I/O interrupt. When I shorten the printf statement, the I/O interrupt gets called more often than with a long printf statement.
Strange. Any ideas? I guess the printf source is not open, is it?