
Timer Delay: CMSIS-RTOS RTX

Hello,

I have a problem with CMSIS-RTOS RTX.

My program runs through the main() function and after that jumps into the Timer_Callback() function. I don't understand why I must define a timerDelay.

I thought that if I define a timerDelay of, for example, 10,000 milliseconds, my program would jump into the Timer_Callback() function every 10 seconds. But that isn't the case: it jumps into the Timer_Callback() function every second.

I hope I have explained my problem clearly and that someone can help me. I apologize in advance for my bad English.

Here is my source code:

#include "LPC43xx.h"
#include "Board_GLCD.h"
#include "GLCD_Config.h"
#include "cmsis_os.h"
#include <stdio.h>

#define STRINGBUF_LEN 21
extern GLCD_FONT GLCD_Font_16x24;
char StringBuf[STRINGBUF_LEN];

void Timer_Callback(void const *arg);
osTimerDef(Timer, Timer_Callback);
uint32_t exec;

volatile int i = 0;

void Timer_Callback(void const *arg) {

        sprintf(StringBuf, "Access: %d", i);
        GLCD_DrawString(0, i*24, (char*)StringBuf);

        i++;
}


int main(void)
{
        osTimerId id;
        osStatus status;
        uint32_t timerDelay;

        GLCD_Initialize();

        GLCD_SetFont            (&GLCD_Font_16x24);
        GLCD_DrawString         (0, 0*24, "                    ");
        GLCD_DrawString         (0, 1*24, "                    ");
        GLCD_DrawString         (0, 2*24, "                    ");
        GLCD_DrawString         (0, 3*24, "                    ");
        GLCD_DrawString         (0, 4*24, "                    ");
        GLCD_DrawString         (0, 5*24, "                    ");
        GLCD_DrawString         (0, 6*24, "                    ");
        GLCD_DrawString         (0, 7*24, "                    ");
        GLCD_DrawString         (0, 8*24, "                    ");
        GLCD_DrawString         (0, 9*24, "                    ");

        exec = 2;

        id = osTimerCreate(osTimer(Timer), osTimerPeriodic, &exec);

        if(id != NULL) {
                timerDelay = 10000; // ???????????????????????????????????????????
                status = osTimerStart(id, timerDelay);

                if(status != osOK)
                        GLCD_DrawString(0, 8*24, "Timer: not started");
                else
                        GLCD_DrawString(0, 9*24, "Timer: started");
        }
}

  • The ADC register only has room for one result.

    So it is not an average but the result of the last conversion performed.
    If you start a conversion every second, then you get one snapshot of your input signal every second.

    If you start your conversion every 3.333ms, then the ADC register will get one snapshot value every 3.333ms.

    If you want averages, you either have to add a low-pass filter on the outside, or oversample and add multiple ADC conversion results into a local variable and then divide by the number of samples you added to that variable. Or implement running-average logic, where you keep an array of n samples, subtract the value of the oldest sample before replacing it with a new sample, and add the value of the new sample to your local variable.

    The ADC doesn't care about your task-switch frequency. It only cares about how often you start a conversion, and how fast you clock the ADC, i.e. how soon after the conversion starts the result will be available.

    Note that you can use an ISR to pick up the result as soon as the ADC has it available. Or, if you start conversions from a timer ISR, you can extract the current ADC value, start a new conversion, and then return from the ISR. There will be a one-timer-period lag between conversion start and extraction of the result, but that is normally not a problem.
