I'm testing the uVision3 v8.16 IDE in connection with ADI's original ADuC814 eval board. When driving the DACs, everything works as long as I'm NOT using printf.
Does printf reset the DAC settings or something like that?
I inserted a condition so that printf is only called after some delay. Everything works fine until printf is called for the first time, so I assume it is printf itself and not the global stdio setup that causes the effect.
Please see the small example below. When the printf call is removed, DAC0 correctly outputs a sawtooth wave, but it no longer does once printf is included.
...
#include <stdio.h>
#include <REG52.H>

sfr PLLCON = 0xD7;          // PLL configuration byte
sfr DACCON = 0xFD;          // DAC control register
sfr DAC0L  = 0xF9;          // DAC0 data register

unsigned char Output;

void main(void) {
    PLLCON = 0x00;          // CPU 16.0 MHz clock
    DACCON = 0xFF;          // set both DACs: on, 8-bit
    while (1) {
        DAC0L = Output;     // drive DAC0 with the sawtooth value
        Output++;
        printf("%02bX \n", Output);
    }
}
The problem was solved by shifting baud-rate generation from T1 to T2.
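For reference, this is roughly how I set up Timer 2 as the baud-rate generator so that Timer 1 stays free (a sketch only; the core-clock value, baud rate and reload calculation are assumptions for an ADuC814 running at the full 16.78 MHz PLL clock, so recalculate them for your setup):

#include <REG52.H>

#define CORE_CLK 16777216UL                     /* assumed PLL core clock in Hz */
#define BAUDRATE 9600UL                         /* assumed baud rate */

void uart_init_t2(void)
{
    unsigned int reload;

    /* reload = 65536 - core_clock / (32 * baud), rounded to nearest */
    reload = (unsigned int)(65536UL - ((CORE_CLK + 16UL * BAUDRATE) / (32UL * BAUDRATE)));

    SCON   = 0x52;              /* UART mode 1, receiver enabled, TI set so the */
                                /* first putchar() of the Keil library doesn't block */
    RCAP2H = (unsigned char)(reload >> 8);
    RCAP2L = (unsigned char)(reload & 0xFF);
    TH2    = RCAP2H;            /* start counting from the reload value */
    TL2    = RCAP2L;
    T2CON  = 0x34;              /* RCLK = TCLK = 1, TR2 = 1: Timer 2 clocks the UART */
}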
New problem, fitting the headline: when using a simple putchar in the main loop there is no problem, but when using printf instead, the mentioned I/O interrupt is slowed down significantly.
I have an interrupt firing several hundred times a second, reading and writing external I/O, and it works fine. The main program just supervises things and puts them out on the UART.
Does printf, or some sub-part of it, disable interrupts at runtime? As far as I have read, printf just uses putchar, so there should be no such problem, right?
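For comparison, my understanding of the typical polled putchar that the C51 printf ends up calling is just this (a sketch of the usual implementation, not the actual library source), and it never touches EA or any interrupt enable:

#include <REG52.H>

char putchar(char c)
{
    while (!TI)                 /* spin until the previous byte has left SBUF */
        ;
    TI = 0;
    return (SBUF = c);          /* no interrupt flags are disabled anywhere */
}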
I basically don't care how long printf takes to put the data out, since the important things happen in the interrupt. So printf's execution time is not a problem for me, as long as it doesn't slow down my I/O interrupt.
My tests show that the cycles spent inside printf definitely slow down my I/O interrupt. When I shorten the printf statement, the I/O interrupt gets called more often than with a long printf statement.
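This is roughly how I measured it (a sketch with placeholder names: io_isr, interrupt number 0 and the format string are assumptions, not my real project code, and the Timer 2 UART setup from above is assumed to have run already):

#include <stdio.h>
#include <REG52.H>

volatile unsigned char isr_calls = 0;   /* single byte so main() reads it atomically */

void io_isr(void) interrupt 0           /* placeholder for the external I/O interrupt */
{
    isr_calls++;                        /* plus the real I/O read/write work */
}

void main(void)
{
    unsigned char before, after;

    /* uart_init_t2(); */               /* baud-rate setup as sketched above */
    EX0 = 1;                            /* enable the placeholder interrupt source */
    EA  = 1;

    while (1) {
        before = isr_calls;
        printf("%02bX\n", before);      /* compare a short vs. a long format string here */
        after  = isr_calls;
        /* (after - before) is noticeably smaller with longer printf output */
    }
}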
Strange. Any ideas? I guess the printf source is not open, is it?