
Analog-to-digital converter (ADC) device

The analog signal is sampled at a certain rate f.

Each sample is then converted to digital form using 12-bit successive approximation (SAR).

If each cycle of the conversion (set the corresponding bit, compare the value against the reference voltage, and clear the bit if necessary) takes 30 microseconds, how much time is spent converting each sample to digital?

Parents
  • Generally, a total conversion involves several phases. If you are asking how long an actual conversion takes: this would be the time to start the conversion (writing the SFR), the sample time (charging the internal sample capacitor), the conversion time (the actual comparisons that generate the bits), perhaps a self-calibration, and finally the time to write the binary value to the result SFR. Studying the data sheet of the device you are using should provide the details of the ADC's timing. You didn't mention which device you are using, so I can't calculate it for you.

