Analog-to-Digital Converter (ADC) device

The analog signal is sampled at a certain rate f.

Each sample is then converted to digital using a 12-bit successive-approximation register (SAR).
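For intuition, here is a minimal C sketch of the binary search a 12-bit SAR performs, one bit per cycle from MSB to LSB. The comparator model, the function names (comparator, sar_convert), and the voltage values are illustrative assumptions, not part of the question.

```c
#include <stdio.h>
#include <stdint.h>

/* Idealized comparator: returns 1 if the input voltage is at or above
 * the DAC voltage produced by the trial code. In hardware this is the
 * analog comparator against the reference-derived DAC output. */
static int comparator(double vin, uint16_t trial_code, double vref)
{
    double vdac = vref * trial_code / 4096.0;  /* 12-bit DAC: 2^12 = 4096 steps */
    return vin >= vdac;
}

/* 12-bit successive approximation: one cycle per bit, MSB first.
 * Each cycle sets the bit under test, compares against the reference,
 * and clears the bit if the trial voltage overshoots the input. */
static uint16_t sar_convert(double vin, double vref)
{
    uint16_t code = 0;
    for (int bit = 11; bit >= 0; bit--) {
        code |= (uint16_t)(1u << bit);          /* set the bit under test */
        if (!comparator(vin, code, vref))
            code &= (uint16_t)~(1u << bit);     /* clear it if too high */
    }
    return code;
}

int main(void)
{
    double vref = 3.3;   /* example reference voltage (assumed) */
    double vin  = 1.25;  /* example input voltage (assumed) */
    printf("code = %u\n", (unsigned)sar_convert(vin, vref));
    return 0;
}
```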

If each cycle of the conversion (which sets the bit under test, compares the resulting value against the reference voltage, and clears the bit if necessary) takes 30 microseconds, how much time is spent converting each sample to digital?
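As a sanity check on the arithmetic, assuming the standard one-cycle-per-bit SAR behavior and no extra sample-and-hold overhead: 12 cycles × 30 µs/cycle = 360 µs per sample. This also bounds the sampling rate at f ≤ 1 / 360 µs ≈ 2.8 kHz.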
