Hi,
I am using a Texas Instruments MSC1210 microcontroller (with a clock frequency of 3.6864MHz). This controller has a 24-bit sigma-delta ADC unit. I am trying to convert the raw ADC values from it into the corresponding voltage (volts or millivolts) but have been unable to do so successfully.
My code specifications are:
1) I am using the Keil compiler version 4.02 and programming in C, with the following register configurations:
- ADMUX = 0x08; // differential input between AIN0 (channel 0) and AINCOM (channel 8)
- ADCON0 = 0x38; // internal voltage reference on at 2.5 V, buffer on, PGA = 1
- ADCON1 = 0x00; // bipolar, auto filter, no calibration
- DECIMATION = 0x120; // 288 decimal
- ACLK = 0x00; // together with DECIMATION = 288 this gives 200 samples/second
From theory, the formula below should give the correct ADC-to-volts conversion:
signal (volts) = ((Raw ADC Value)/ (2^24)) * 2.5V
But in practice the above formula does not give me the right voltage output; it gives a very small value.
On the other hand, I am also using the data-sheet formula to convert ADC counts to volts, as follows (from the application note for the MSC1210 ADC, SBAA097B page 13, bits-to-volts conversion):
volts = ((N * Vref) / (GC * 0.75 * DEC^3 * BG)) * RawADC + (Vref / (0.75 * DEC^3 * BG)) * Offset
OR
volts = K1 * RawADC + K2 * Offset ................ (1)
where
K1 = (N * Vref) / (GC * 0.75 * DEC^3 * BG)
K2 = Vref / (0.75 * DEC^3 * BG)
Where (from the same application note)
N = 2^22 (for bipolar case and N = 2^21 for unipolar case)
Vref = 2.5 volts (reference voltage)
GC = 3143213 (gain calibration)
DEC = 288 (decimation for Vs located in Decimation = ADCON2:ADCON3 registers)
BG = 2^-2 (bit shift gain)
Offset = Offset Calibration (OC) register value = 13 (almost the same value of 13 comes up every time during program execution).
This evaluates the values of constant K1 and K2 as,
K1 = 0.0000007448132019
K2 = 0.0000005581632945
Hence equation (1) becomes,
volts = 0.0000007448132019 * RawADC + (0.0000005581632945 * 13)
volts = 0.0000007448132019 * RawADC + 0.0000072561228 ................ (2)
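To sanity-check the arithmetic, the data-sheet formula with the values quoted above can be coded directly. This is only a sketch using the poster's numbers (Vref = 2.5 V, GC = 3143213, DEC = 288, BG = 2^-2, N = 2^22 for bipolar, Offset = 13 hard-coded for illustration):

```c
#include <stdio.h>

/* Sketch of equation (1)/(2): volts = K1 * RawADC + K2 * Offset,
   with K1 and K2 computed from the constants given in the post. */
double bits_to_volts(long raw_adc)
{
    const double N     = 4194304.0;  /* 2^22, bipolar */
    const double Vref  = 2.5;        /* volts */
    const double GC    = 3143213.0;  /* gain calibration register */
    const double DEC   = 288.0;      /* decimation */
    const double BG    = 0.25;       /* bit-shift gain, 2^-2 */
    const double scale = 0.75 * DEC * DEC * DEC * BG;  /* 0.75*DEC^3*BG */
    const double K1    = (N * Vref) / (GC * scale);
    const double K2    = Vref / scale;
    const double offset = 13.0;      /* OC register value from the post */
    return K1 * (double)raw_adc + K2 * offset;
}
```

Evaluating K1 and K2 this way reproduces the constants quoted above, so the constant calculation itself looks correct; the problem is more likely in what the raw reading represents.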
If I use (2) in my calculation, the RMS value of an AC signal is always calculated very accurately. But by rights the ADC should give the PEAK (or instantaneous) values of the AC signal, not the RMS values, and the ADC values never reach the peak of the AC signal; they stay near its RMS value.
I have used both conversion methods but am still not able to get the right values. Please point out anything incorrect in my calculations and/or register configurations.
Asad
Well, too much to read, but remember that the formula

signal (volts) = ((Raw ADC Value) / (2^24)) * 2.5V

requires you to have converted your raw ADC value into a floating-point number (for example with a type cast) before the division. Otherwise you take the raw integer value and shift it right 24 steps. What will then be left to multiply with 2.5?
If you start by multiplying by 2.5, then you get your conversion to floating point. Then perform the division.
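The cast advice above can be sketched in C. This is a minimal illustration, not the poster's code; it assumes Vref = 2.5 V, PGA = 1, and bipolar mode, in which the 24-bit result is two's complement and full scale is +/-2^23 counts, so the reading must also be sign-extended before the floating-point division:

```c
#include <stdio.h>

/* Convert a raw 24-bit bipolar reading to volts (assumed Vref = 2.5 V).
   The cast to double happens BEFORE the division, so nothing is
   truncated away; bit 23 is treated as the sign bit. */
double adc_to_volts(unsigned long raw24)
{
    long signed_raw = (long)(raw24 & 0xFFFFFFUL);
    if (signed_raw & 0x800000L)      /* bit 23 set: negative reading */
        signed_raw -= 0x1000000L;    /* sign-extend from 24 bits */
    return ((double)signed_raw / 8388608.0) * 2.5;  /* 2^23 counts = Vref */
}
```

Note that in bipolar mode the divisor is 2^23, not 2^24, since one bit carries the sign; dividing a signed reading by 2^24 would also halve every result.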
Or perform this as a fixed-point integer calculation, in which case you must also start with a multiplication - or a shift-left - before a division throws away the least significant bits, and you end up with a final answer in mV or uV or whatever scaling you selected.
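A fixed-point sketch of that idea, converting a sign-extended bipolar reading to microvolts (again assuming Vref = 2.5 V, full scale +/-2^23 counts). It is illustrated here with a 64-bit intermediate, which Keil C51 does not provide natively; on the 8051 you would split the multiply into 32-bit steps, so treat this as host-side illustration of the ordering, not target code:

```c
/* Multiply first in a wider integer, THEN divide by 2^23, so the
   division does not discard the low bits of the reading.
   2.5 V full scale = 2,500,000 uV. */
long raw_to_microvolts(long signed_raw)
{
    return (long)(((long long)signed_raw * 2500000LL) / 8388608LL);
}
```

Dividing first (signed_raw / 8388608) would give 0 for every reading below full scale, which is exactly the "very small value" symptom.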
As for getting the RMS or peak value - that also depends on whether there is a low-pass filter in the analog section. With a fast ADC, or a slow ADC with sample-and-hold, you can capture the peak value.