
ADC Help!!!

I did the configuration for the ADC and used a 2 V supply as the analog input, but the LCD shows nothing. The board I'm using is the C8051F206; its integrated ADC is 12-bit. I'm using port pin P1.7 as the analog input.

MOV AMX0SL,#2FH     ; Selects P1.7 as the input
MOV ADC0CF,#000H    ; SAR conversion clock = SYSCLK/1, gain = 1
MOV ADC0CN,#0C1H    ; Enable the ADC (ADCEN = 1), left-justified data
MOV ADC0L, #000H    ; ADC Data Word Register
MOV ADC0H, #000H    ; ADC Data Word Register
MOV ADC0LTH, #000H  ; ADC Less-Than High Byte Register
MOV ADC0LTL, #000H  ; ADC Less-Than Low Byte Register
MOV ADC0GTH, #0FFH  ; ADC Greater-Than High Byte Reg
MOV ADC0GTL, #0FFH  ; ADC Greater-Than Low Byte Reg

CONVERT: SETB ADBUSY        ; Start a conversion
         LCALL DELAY
POLL:    JB ADCINT,PRINT    ; Poll until the conversion is done
         SJMP POLL
PRINT:   CLR ADCINT         ; Clear the conversion-complete flag
         CLR RS             ; RS = 0: command to the LCD
         MOV DAT,#0FH       ; 0FH = display on
         SETB EN            ; Pulse EN to latch the command
         LCALL DELAY
         CLR EN
         MOV A,ADC0H        ; Send the ADC high byte to the LCD
         LCALL WRITE_TEXT
         RET
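One thing worth checking, and possibly why the LCD stays blank: the code hands WRITE_TEXT the raw binary byte from ADC0H, while a character LCD expects ASCII codes, so the raw value may display as nothing or garbage. Below is a minimal sketch of printing that byte as two hex digits instead. It assumes WRITE_TEXT sends the character in A to the LCD as data; the PRINT_HEX and NIB2ASC names are invented for the example.

PRINT_HEX: MOV A,ADC0H
         SWAP A             ; High nibble first
         ANL A,#0FH
         LCALL NIB2ASC
         LCALL WRITE_TEXT
         MOV A,ADC0H        ; Then the low nibble
         ANL A,#0FH
         LCALL NIB2ASC
         LCALL WRITE_TEXT
         RET

NIB2ASC: CLR C              ; Convert a nibble (0-15) in A to ASCII
         SUBB A,#0AH        ; Carry set if nibble < 10
         JC NIBNUM
         ADD A,#41H         ; 10-15 -> 'A'-'F'
         RET
NIBNUM:  ADD A,#3AH         ; Undo the subtract, add 30H: 0-9 -> '0'-'9'
         RET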

  • A 12-bit ADC means that there are 4096 steps, so each step represents about 0.024% (1/4096) of full range. The voltage reference you use must have a similar precision if you want all bits of the ADC value to mean anything. If the voltage reference is +/- 1%, then two units you build can differ by up to 2%, which is roughly 80 steps (2% of 4096 ≈ 82) on the ADC.

    If you buy a multimeter and measure something, the values shown will not be exactly correct. Depending on how the unit is calibrated, it may read too low or too high, and temperature changes also affect how much error there is in the readings.

    There is no difference when using an ADC in a uC project. Either you calibrate the unit (using a calibrated multimeter of higher precision to compare the input voltage with the value your unit reports) and then adjust for the difference, or you give the ADC a voltage reference that is guaranteed to stay within the tolerances you require.

    Yes, I know that you program in assembly. That was why I said: if you can afford it. The amount of code for working with 24-bit or 32-bit integers quickly grows, since the uC can't handle more than 8 bits at a time.

    In some situations it may be better to use a successive-approximation style conversion from ADC steps to actual voltage, since it can be implemented without multiply or divide instructions. All you need are add and shift operations, and those are easy to apply even to multi-precision integers. A sketch of the shift-and-add idea follows below.
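    As one concrete reading of that idea, here is a minimal sketch (mine, not from the original posts) that scales a 12-bit code to millivolts by decomposing the scale factor into right-shifts. It assumes VREF = 2.5 V, so mV = code * 2500/4096, and 2500/4096 happens to equal 1/2 + 1/16 + 1/32 + 1/64 + 1/1024 exactly. The register assignments and the CODE2MV/SHIFT_ADD labels are invented for the example.

    ; Hypothetical sketch: convert a 12-bit ADC code in R3:R2 (high:low)
    ; to millivolts in R5:R4, using only shifts and adds:
    ;   mV = code/2 + code/16 + code/32 + code/64 + code/1024
    CODE2MV: CLR A
             MOV R4,A           ; Result low byte = 0
             MOV R5,A           ; Result high byte = 0
             MOV R7,#1          ; Add code/2
             LCALL SHIFT_ADD
             MOV R7,#4          ; Add code/16
             LCALL SHIFT_ADD
             MOV R7,#5          ; Add code/32
             LCALL SHIFT_ADD
             MOV R7,#6          ; Add code/64
             LCALL SHIFT_ADD
             MOV R7,#10         ; Add code/1024
             LCALL SHIFT_ADD
             RET

    ; Shift a copy of the code right by R7 bits and add it to R5:R4
    SHIFT_ADD: MOV A,R2
             MOV R0,A           ; R1:R0 = working copy of the code
             MOV A,R3
             MOV R1,A
    SHLOOP:  CLR C
             MOV A,R1
             RRC A              ; Shift high byte right, old bit 0 -> carry
             MOV R1,A
             MOV A,R0
             RRC A              ; Carry shifts into bit 7 of the low byte
             MOV R0,A
             DJNZ R7,SHLOOP
             MOV A,R4           ; 16-bit add of the shifted copy
             ADD A,R0
             MOV R4,A
             MOV A,R5
             ADDC A,R1
             MOV R5,A
             RET

    Each shift truncates, so the sum can come out a few millivolts low; adding a rounding constant or carrying one extra fraction byte tightens that up. A different VREF just means a different set of shift terms.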
