Hi everyone,
I am designing a DC/DC converter for high-frequency applications, and I am using closed-loop current control to regulate it. The XC164CS has a built-in ADC, and I need to monitor the converter's output voltage and current as well as its input voltage and current. I just want to know how quickly I need to measure these inputs, and how I can check that.
Cheers
vivek
Are you going to let the processor be in the control loop, or just supervise?
You normally don't include the processor in the control loop of a DC/DC.
Hi
Yes, you are right. I only want to monitor the signals; the processor will not be part of the closed loop, it will just supervise. I want to know how quickly I need to measure my signals, and how quickly each input can be sampled.
Vivek
The information you are seeking can be found in the data sheet and user's manual.
For an analog-to-digital conversion, the fastest achievable conversion time is given by the following formulas:
conversion time (10-bit) = 40 * tBC + tS + 6 * tSYS
conversion time (8-bit)  = 32 * tBC + tS + 6 * tSYS
Conditions:
1) The maximum frequency of the ADC base clock is 20 MHz.
2) The maximum system clock is 40 MHz.
3) The minimum sample time is tBC * 4 * (<ADSTC> + 1).
4) Post-calibration is off.
Assumptions: fSYS = 40 MHz (i.e. tSYS = 25 ns), ADCTC = '01', ADSTC = '00'
Base clock fBC = fSYS / 2 = 20 MHz, i.e. tBC = 50 ns
Sample time tS = tBC * 8 = 400 ns
Conversion 10-bit:
With post-calibration: 52*tBC + tS + 6*tSYS = (2600 + 400 + 150) ns = 3.15 us
Post-calibration off:  40*tBC + tS + 6*tSYS = (2000 + 400 + 150) ns = 2.55 us

Conversion 8-bit:
With post-calibration: 44*tBC + tS + 6*tSYS = (2200 + 400 + 150) ns = 2.75 us
Post-calibration off:  32*tBC + tS + 6*tSYS = (1600 + 400 + 150) ns = 2.15 us
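If it helps, here is a minimal C sketch that plugs the example numbers into the post-calibration-off formulas above. The clock settings (fSYS = 40 MHz, fBC = fSYS/2, tS = 8*tBC) are just the assumptions from the worked example, not fixed hardware values.

/* Sketch: ADC conversion-time estimate from the formulas above.
   Clock settings are the assumed example values, not hardware constants. */
#include <stdio.h>

int main(void)
{
    const double t_sys = 1.0 / 40e6;   /* 25 ns system clock period */
    const double t_bc  = 2.0 * t_sys;  /* fBC = fSYS / 2 -> 50 ns   */
    const double t_s   = 8.0 * t_bc;   /* sample time, 400 ns       */

    /* Post-calibration off */
    double t_conv10 = 40.0 * t_bc + t_s + 6.0 * t_sys;  /* ~2.55 us */
    double t_conv8  = 32.0 * t_bc + t_s + 6.0 * t_sys;  /* ~2.15 us */

    printf("10-bit conversion: %.2f us\n", t_conv10 * 1e6);
    printf(" 8-bit conversion: %.2f us\n", t_conv8  * 1e6);
    return 0;
}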
Nyquist criterion, my friend: the sample rate must be at least twice the highest frequency present in the signal to be sampled. Application: universal.
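To tie that back to the conversion times above, here is a rough sketch; the four-channel scan is my assumption, matching the four quantities being monitored (Vin, Iin, Vout, Iout).

/* Sketch: per-channel sample rate and usable signal bandwidth when the
   ADC converts several channels back to back. The channel count and the
   2.55 us conversion time are the assumed values from above. */
#include <stdio.h>

int main(void)
{
    const double t_conv   = 2.55e-6;  /* 10-bit, post-calibration off */
    const int    channels = 4;        /* Vin, Iin, Vout, Iout         */

    double fs_per_channel = 1.0 / (t_conv * channels); /* ~98 kS/s */
    double nyquist_bw     = fs_per_channel / 2.0;      /* ~49 kHz  */

    printf("Per-channel sample rate: %.1f kS/s\n", fs_per_channel / 1e3);
    printf("Nyquist bandwidth      : %.1f kHz\n", nyquist_bw / 1e3);
    return 0;
}

So with numbers like these, the supervised signals should contain no significant content above roughly half the per-channel sample rate; otherwise add analog anti-alias filtering in front of the ADC or scan fewer channels.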