Hi everyone,
I am designing a DC/DC converter for high-frequency applications and using closed-loop current control to regulate it. The XC164CS has a built-in ADC, and I need to measure the converter's output voltage and current as well as its input voltage and current. I just want to know how quickly I need to sample these inputs, and how I can check that.
Cheers
vivek
The information you are seeking can be found in the data sheet and user's manual.
The fastest possible analog-to-digital conversion times are given by the following formulas:
conversion time (10-bit) = 40 * tBC + tS + 6 * tSYS
conversion time (8-bit) = 32 * tBC + tS + 6 * tSYS
Conditions:
1) The maximum frequency of the ADC base clock is 20 MHz.
2) The maximum system clock is 40 MHz.
3) The minimum sample time is tS = tBC * 4 * (<ADSTC> + 1).
4) Post-calibration is off.
Assumptions:
fSYS = 40 MHz (i.e. tSYS = 25 ns), ADCTC = '01', ADSTC = '00'
Base clock fBC = fSYS / 2 = 20 MHz, i.e. tBC = 50 ns
Sample time tS = tBC * 8 = 400 ns
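As a quick check, here is a minimal C sketch of these clock calculations. The divider implied by ADCTC = '01' (fBC = fSYS / 2) and the sample time of 8 * tBC are taken from the assumptions above rather than from the data sheet, so verify them against your device's manual:

#include <stdio.h>

int main(void)
{
    /* Assumptions from the post above; verify against the XC164CS data sheet. */
    const double f_sys = 40e6;           /* system clock: 40 MHz               */
    const double t_sys = 1.0 / f_sys;    /* 25 ns                              */

    const double f_bc  = f_sys / 2.0;    /* assumed ADCTC = '01' -> fBC = fSYS/2 */
    const double t_bc  = 1.0 / f_bc;     /* 50 ns                              */

    const double t_s   = t_bc * 8.0;     /* sample time used above: 8 * tBC    */

    printf("tSYS = %.0f ns\n", t_sys * 1e9);  /* 25 ns  */
    printf("tBC  = %.0f ns\n", t_bc * 1e9);   /* 50 ns  */
    printf("tS   = %.0f ns\n", t_s * 1e9);    /* 400 ns */
    return 0;
}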
Conversion 10-bit:
With post-calibration: 52*tBC + tS + 6*tSYS = (2600 + 400 + 150) ns = 3.15 us
Post-calibration off:  40*tBC + tS + 6*tSYS = (2000 + 400 + 150) ns = 2.55 us

Conversion 8-bit:
With post-calibration: 44*tBC + tS + 6*tSYS = (2200 + 400 + 150) ns = 2.75 us
Post-calibration off:  32*tBC + tS + 6*tSYS = (1600 + 400 + 150) ns = 2.15 us
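To reproduce these numbers for your own clock settings, here is a small C sketch of the same arithmetic. The cycle counts (52/40 for 10-bit, 44/32 for 8-bit, with and without post-calibration) come from the formulas quoted above:

#include <stdio.h>

/* Conversion time in ns: n_bc base-clock cycles, plus the sample time,
   plus 6 system-clock cycles, per the formulas above. */
static double conv_time_ns(int n_bc, double t_bc_ns, double t_s_ns, double t_sys_ns)
{
    return n_bc * t_bc_ns + t_s_ns + 6.0 * t_sys_ns;
}

int main(void)
{
    const double t_sys_ns = 25.0;   /* fSYS = 40 MHz */
    const double t_bc_ns  = 50.0;   /* fBC  = 20 MHz */
    const double t_s_ns   = 400.0;  /* tS   = 8*tBC  */

    printf("10-bit, post-cal on : %.2f us\n", conv_time_ns(52, t_bc_ns, t_s_ns, t_sys_ns) / 1000.0); /* 3.15 */
    printf("10-bit, post-cal off: %.2f us\n", conv_time_ns(40, t_bc_ns, t_s_ns, t_sys_ns) / 1000.0); /* 2.55 */
    printf(" 8-bit, post-cal on : %.2f us\n", conv_time_ns(44, t_bc_ns, t_s_ns, t_sys_ns) / 1000.0); /* 2.75 */
    printf(" 8-bit, post-cal off: %.2f us\n", conv_time_ns(32, t_bc_ns, t_s_ns, t_sys_ns) / 1000.0); /* 2.15 */
    return 0;
}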