
Best way to control a servo?

Hi,

This might be a little OT, but I've been exploring a few new ways of controlling a servo.

I'm using an 8051 board of my own design and a Hitec HS300BB servo; the servo's control line is connected to a digital output line.

Up until now I've been using code similar to this:

sbit P1_1 = P1^1;
unsigned int x;               /* loop counter (was undeclared) */
unsigned int wait;            /* iteration count for the delay */

P1_1 = 1;                     /* drive the pulse high */
for(x = 0; x <= wait; x++);   /* busy-wait for the pulse width */
P1_1 = 0;                     /* drive the pulse low */
for(x = 0; x <= wait; x++);   /* busy-wait again before the next pulse */

This solution works, but it isn't a very nice way of doing it because the timing depends heavily on things like loop overhead and processing power.

So, to my question: what is considered the best way of controlling a servo that isn't as sensitive to overhead (with the hardware I have, of course)? Some example code would be appreciated.

TIA,
Mike


  • A more general solution you might like would be to program one of the timers to correspond to your "wait" interval, and toggle the output bit when the interrupt occurs. Keil's app note #105 talks about a general timer tick interrupt.

    http://www.keil.com/appnotes/files/apnt_105.pdf

    If the delays you need are really short (a few instruction cycle times), then you might need an instruction-loop delay. See, for example, the recent thread:

    http://www.keil.com/forum/docs/thread2150.asp

    Your particular part might include some PWM hardware to make your life even easier by driving an output pin for you according to some timer parameters.

    If you're relying on timing with an instruction loop to be precise, then you might want to keep interrupts masked during the loop, so that you don't lose any extra time to the interrupt handler.
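    As a hardware-free sketch of the toggle-on-interrupt idea: the ISR toggles the pin and reprograms the timer for either the pulse width or the remainder of the frame. The pin and the 1 MHz tick here are simulated stand-ins of my own, not anything from the app note:

    ```c
    #include <stdio.h>

    #define FRAME_US 20000U   /* 50 Hz servo frame */

    static unsigned char pin;   /* stands in for the output bit */

    /* What the timer ISR would do: toggle the pin and return how many
       timer counts (here: microseconds) to program before the next
       toggle -- high for the pulse width, low for the rest of the frame. */
    static unsigned int on_timer_tick(unsigned int pulse_us)
    {
        pin = !pin;
        return pin ? pulse_us : (FRAME_US - pulse_us);
    }

    int main(void)
    {
        unsigned int high = on_timer_tick(1700U);  /* pin goes high */
        unsigned int low  = on_timer_tick(1700U);  /* pin goes low  */
        printf("high %u us + low %u us = %u us frame\n",
               high, low, high + low);
        return 0;
    }
    ```

    On real hardware the two return values would be turned into timer reload values inside the interrupt handler, so the CPU is free between edges.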


  • Here's what I figured out using a Timer 2 example I found on the Keil site:

    void init_timer2 (void) {
    	/*--------------------------------------
    	Set the reload value to give a period
    	of 20000 clocks.
    	--------------------------------------*/
    	CRCL = (65535UL-20000UL);        /* low byte of reload  */
    	CRCH = (65535UL-20000UL) >> 8;   /* high byte of reload */
    
    	CCEN = 0x08;                     /* compare mode for CC register 1 */
    
    	/* Compare value for a 1700-clock (1.7 ms) pulse. */
    	CCL1 = (65535UL-1700UL);
    	CCH1 = (65535UL-1700UL) >> 8;
    
    	TL2 = CRCL;                      /* load the timer for the first period */
    	TH2 = CRCH;
    
    	/*--------------------------------------
    	Set Timer2 for 16-bit auto-reload.
    	The timer counts to 0xFFFF, overflows,
    	is reloaded, and generates an interrupt.
    	--------------------------------------*/
    	T2CON = 0x11;                /* 0XX10001 */
    
    	ET2 = 1;                      /* Enable Timer 2 Interrupts */
    	EAL = 1;                      /* Global Interrupt Enable */
    
    	P5 = 0x18;
    }
    
    void timer2_ISR (void) interrupt 5 {
    	TF2 = 0;                      /* clear the overflow flag */
    }

    I've set it to reload every 20000 system ticks (1 MHz / 20000 = 50 Hz), which should generate an interrupt every 20 ms.

    So far, so good... but I have to generate a pulse of 1.7 ms to center the servo, 8.5 ms to go absolute left, and 2.5 ms to go absolute right.

    Am I doing something wrong or is this a defective servo?
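    For what it's worth, the reload arithmetic can be sanity-checked on a PC with plain C (no SFRs; this assumes the 1 MHz timer clock from the post). An up-counting 16-bit timer overflows at 65536, so the reload for an N-count period is 65536 - N, and the 65535 - N form above runs one count long per period:

    ```c
    #include <stdio.h>

    /* Reload value for an N-count period on an up-counting 16-bit timer
       that overflows at 65536.  (65535 - N would be one count too slow.) */
    static unsigned int reload_for(unsigned long counts)
    {
        return (unsigned int)(65536UL - counts);
    }

    int main(void)
    {
        unsigned int frame = reload_for(20000UL);  /* 20 ms at 1 MHz */
        unsigned int pulse = reload_for(1700UL);   /* 1.7 ms compare */

        printf("frame reload  0x%04X -> CRCH=0x%02X CRCL=0x%02X\n",
               frame, frame >> 8, frame & 0xFF);
        printf("pulse compare 0x%04X -> CCH1=0x%02X CCL1=0x%02X\n",
               pulse, pulse >> 8, pulse & 0xFF);
        return 0;
    }
    ```

    One extra count in 20000 is harmless for a servo, but checking the byte split this way rules out the software side of the question.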

  • Borrow a Digital Storage Oscilloscope (DSO), then you can check exactly what sort of pulse your chip is outputting.
    This avoids the awkward "two unknowns" problem. ("Is it my software, or is it the hardware?")