Hello,
I am having problems getting a 10-bit result from the internal ADC of the AT89C51CC03.
I always get only an 8-bit result, even though I set the PSIDLE bit. Also, I did not need the ADC end-of-conversion interrupt to continue after the conversion, although the device manual explains that it is required.
So it seems to me that I made a mistake in setting up the ADC for 10-bit resolution.
I include the following code, which should not run with the EOC interrupt disabled. But it runs!
#include "AT89C51CC03.h"

unsigned int i;

void main(void)
{
    ADCF  = 0x20;                  /* configure channel P1.5 (AN5) for ADC        */
    ADCLK = 0x00;                  /* init prescaler for ADC clock                */
    ADCON = 0x20;                  /* enable the ADC                              */
    EA    = 0;                     /* disable interrupts globally                 */
    EADC  = 0;                     /* disable the ADC end-of-conversion interrupt */

    P31 = 1;                       /* signal running via TXD pin                  */
    for (i = 0; i < 20000; i++);

    while (1)
    {
        ADCON &= ~0x07;            /* clear the channel field ADCON[2:0]          */
        ADCON |= 0x05;             /* select channel 5                            */
        ADCON |= 0x40;             /* set PSIDLE: 10-bit mode                     */
        ADCON |= 0x08;             /* set ADSST: start conversion                 */

        P31 = ~P31;                /* signal running via TXD pin                  */

        while (!(ADCON & 0x10));   /* wait for ADEOC (end of conversion)          */
        ADCON &= 0xEF;             /* clear ADEOC                                 */

        for (i = 0; i < 20000; i++);
    }
}
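For completeness, here is how I would combine ADDH and ADDL into one 10-bit value after the polling loop (just a minimal sketch; the code above never reads the result registers, and adc_result is my own variable name):

    unsigned int adc_result;                    /* combined 10-bit value, 0..1023     */

    adc_result  = ((unsigned int)ADDH << 2);    /* ADDH holds bits 9..2 of the result */
    adc_result |= (ADDL & 0x03);                /* ADDL[1:0] holds bits 1..0          */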
So please contact me if you have any ideas, hints, solutions, etc.
Have you considered using a "driver" for the ADC hardware? The code below works on an AT89C51AC3 @ 29.4912 MHz in X1 mode. It also takes care of the watchdog. Minor adjustments for the Keil compiler may be needed.
// ADC 0-3V 8bit Standard or 10bit Best Precision at port P1[7..0]

__bit adc_eoc = 0;                        // ADC end of conversion
volatile unsigned char adc_val   = 0x00;  // ADC value, 8-bit standard precision
volatile unsigned int  adc_value = 0x00;  // ADC value, 10-bit best precision

void ADC_ISR(void) __interrupt(ADC_VECTOR) __using(3)
{
    ADCON &= ~ADEOC;             // set by hardware at end of conversion, cleared by software
    adc_val = ADDH;              // ADDH contains the 8-bit standard value of the ADC
    adc_value  = ADDH << 2;      // ADDH:ADDL contains the 10-bit value of the ADC
    adc_value |= (ADDL & 0x03);  // mask ADDL[1..0] lower bits, best precision
    adc_eoc = 1;                 // set adc_eoc semaphore to signal end of conversion
}

void initadc(unsigned char ADCFchannels)   // setup P1 pins for ADC. call: initadc(CH6|CH7), 8 BITS = 8 CHANNELS
{
    ADCON = 0x00;                // clear
    ADCF  = 0x00;                // clear all channels
    ADCF |= (ADCFchannels);      // set e.g. CH6, CH7 as analog inputs
    ADCLK = 0x00;                // ADCLK[4..0] PRS4..0 = 0 => X1 = Fxtal/128
                                 // F_XTAL = 29.491200 MHz / 128 = 230.400 kHz
    ADCON |= ADEN;               // ADCON.ADEN = enable (disable for low power consumption)
                                 // Tsetup = 4 usec before the first conversion
    EADC = 1;                    // IEN1.EADC enable ADC interrupt
                                 // EA = 1 global interrupt enable should be handled in main
}

unsigned char adcreadstd(unsigned char ADCONchannel)   // standard precision, call with channel to be converted 0..7
{
    ADCON &= ~(SCH2|SCH1|SCH0);  // clear all SCH channel bits, 3 BITS = 8 CHANNELS
    ADCON |= ADCONchannel;       // select channel ADCON.[SCH2..0]
    ADCON &= ~PSIDLE;            // standard 8-bit conversion, clear ADCON.PSIDLE = 0
    adc_eoc = 0;                 // clear the end-of-conversion semaphore before starting
    ADCON |= ADSST;              // start conversion, cleared by hardware after completion
    while (!adc_eoc)
        resetWDT();
    return (adc_val);
}

unsigned int adcreadpre(unsigned char channel)   // best precision, call with channel to be converted 0..7
{
    ADCON &= ~(SCH2|SCH1|SCH0);  // clear all SCH channel bits
    ADCON = channel;             // select channel ADCON.[SCH2..0]
    adc_eoc = 0;                 // clear the end-of-conversion semaphore before starting
    ADCON |= (PSIDLE|ADSST);     // best precision 10-bit conversion: ADCON.PSIDLE = 1, ADCON.ADSST = 1
                                 // PSIDLE stops the MCU but not the peripherals; ADC_ISR wakes the system
    while (!adc_eoc)
        resetWDT();
    return (adc_value);
}
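Here is a rough usage sketch of how main() might call these routines (my own minimal example, not part of the driver; it assumes the same header with the CHx masks and the resetWDT() routine used above):

    void main(void)
    {
        unsigned char result8;
        unsigned int  result10;

        initadc(CH5);                  // P1.5 (AN5) as analog input (CH5 assumed defined like CH6/CH7)
        EA = 1;                        // global interrupt enable so ADC_ISR can run

        while (1)
        {
            result8  = adcreadstd(5);  // 8-bit standard conversion on channel 5
            result10 = adcreadpre(5);  // 10-bit best-precision conversion on channel 5
            // ... use result8 / result10 ...
        }
    }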
Thank you for your answer.

I tried your code, but I still got only an 8-bit result. (ADDH toggles by +/- 1 LSB, and ADDL toggles from 0..3.) I think that with 10-bit precision ADDH should be stable.

Also, the MCU does not stop when the PSIDLE bit is set. So either the ADC is not in 10-bit mode, or this mode does not work?
I have changed one line in adcreadpre():

    ADCON |= channel;            // select channel ADCON.[SCH2..0]
"I tried your code, but I still got only an 8-bit result."

No. You appear to have somewhat strange ideas of what makes a result a 10-bit one. You appear to be talking about the values of 10 different bits, so what exactly is 8-bit about that?

"(ADDH toggles by +/- 1 LSB, and ADDL toggles from 0..3.) I think that with 10-bit precision ADDH should be stable."

That thought is incorrect. You're talking about a digital value here. All bits can change for a 1-bit change in the value: 0x1FF -> 0x200.
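As a concrete example, using the ADDH:ADDL packing from the driver code above: a reading of 0x1FF gives ADDH = 0x7F and ADDL[1:0] = 0x03, while the next value, 0x200, gives ADDH = 0x80 and ADDL[1:0] = 0x00. A change of a single LSB flips all ten bits, so there is no reason to expect ADDH to stay stable.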
The accuracy of the ADC value, whether 8-bit or 10-bit, is affected by the hardware conditions on the specific board you use:
- VSS as GND should be solid.
- VCC as supply voltage should be properly distributed on the board.
- VAREF, the reference voltage for the ADC, should come from a stabilized source that does not drift.
- VAGND, the reference ground, should be properly connected on the PCB from the analog section to the digital section.
- The power supply should have very small ripple and maximum noise rejection.
- For better results the board should be built with 4 layers.
- Noisy signal paths should be avoided.
- Watch out for undesired capacitance on signal probes.
At VAREF = 2.5 V with 10-bit resolution, 1 LSB is about 2.44 mV (2.5 V / 1024), so the lowest 3 bits together span about 19.5 mV.
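A quick sketch of that arithmetic in code (purely my own illustration; VAREF_MV and adc_to_mv are made-up names, and 2.5 V is assumed for the reference):

    #define VAREF_MV 2500UL                   /* assumed 2.5 V reference voltage */

    /* convert a 10-bit reading (0..1023) to millivolts */
    unsigned int adc_to_mv(unsigned int counts)
    {
        /* 1 LSB = 2500 / 1024 = ~2.44 mV; the lowest 3 bits cover ~19.5 mV */
        return (unsigned int)((counts * VAREF_MV) / 1024UL);
    }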
Precision is different from Resolution.
To overcome the changing lower bits you can oversample the desired signal, e.g. 32 times, and then shift the summed measurements right by 5 (or divide by 32). Another typical approach is to construct a low-pass digital filter using 5 integers to store the intermediate results, and so on...
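A minimal sketch of the 32x oversampling idea, assuming the adcreadpre() routine from the driver code above (function and variable names here are just placeholders):

    /* average 32 consecutive 10-bit readings to quiet the lowest bits */
    unsigned int adc_read_avg32(unsigned char channel)
    {
        unsigned long sum = 0;             /* 32 * 1023 fits easily in a long */
        unsigned char n;

        for (n = 0; n < 32; n++)
            sum += adcreadpre(channel);    /* 10-bit best-precision reading   */

        return (unsigned int)(sum >> 5);   /* shift right by 5 = divide by 32 */
    }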
But I don't see any difference between the results of the 8-bit and the 10-bit conversion!

And the 10-bit ADC conversion also runs without the supposedly required ADC end-of-conversion interrupt, although the device manual makes it seem necessary.

So I ask myself: what is wrong?