
Timer Delay: CMSIS-RTOS RTX

Hello,

I have a problem with CMSIS-RTOS RTX.

My program runs through the main() function and afterwards jumps into the Timer_Callback() function. I don't understand why I have to define a timerDelay.

I thought that if I define a timerDelay of, for example, 10,000 milliseconds, my program would jump into the Timer_Callback() function every 10 seconds. But that is not the case: it jumps into the Timer_Callback() function every second.

I hope I have explained my problem clearly and that someone can help me. I apologize in advance for my bad English.

Here is my source-code:

#include "LPC43xx.h"
#include "Board_GLCD.h"
#include "GLCD_Config.h"
#include "cmsis_os.h"
#include <stdio.h>

#define STRINGBUF_LEN 21
extern GLCD_FONT GLCD_Font_16x24;
char StringBuf[STRINGBUF_LEN];

void Timer_Callback(void const *arg);
osTimerDef(Timer, Timer_Callback);
uint32_t exec;

volatile int i = 0;

void Timer_Callback(void const *arg) {

        sprintf(StringBuf, "Access: %d", i);
        GLCD_DrawString(0, i*24, (char*)StringBuf);

        i++;
}


int main(void)
{
        osTimerId id;
        osStatus status;
        uint32_t timerDelay;

        GLCD_Initialize();

        GLCD_SetFont            (&GLCD_Font_16x24);
        GLCD_DrawString         (0, 0*24, "                    ");
        GLCD_DrawString         (0, 1*24, "                    ");
        GLCD_DrawString         (0, 2*24, "                    ");
        GLCD_DrawString         (0, 3*24, "                    ");
        GLCD_DrawString         (0, 4*24, "                    ");
        GLCD_DrawString         (0, 5*24, "                    ");
        GLCD_DrawString         (0, 6*24, "                    ");
        GLCD_DrawString         (0, 7*24, "                    ");
        GLCD_DrawString         (0, 8*24, "                    ");
        GLCD_DrawString         (0, 9*24, "                    ");

        exec = 2;

        id = osTimerCreate(osTimer(Timer), osTimerPeriodic, &exec);

        if(id != NULL) {
                timerDelay = 10000; // ???????????????????????????????????????????
                status = osTimerStart(id, timerDelay);

                if(status != osOK)
                        GLCD_DrawString(0, 8*24, "Timer: not started");
                else
                        GLCD_DrawString(0, 9*24, "Timer: started");
        }
}

  • Check the "Timer clock value [Hz]" setting in the RTX_Conf_CM.c file. Maybe your SysTick timer is not running correctly?

  • Thank you for your answer!

    Do you mean this:

    RTX_Conf_CM.c

    // <h>RTX Kernel Timer Tick Configuration
    // ======================================
    //   <q> Use Cortex-M SysTick timer as RTX Kernel Timer
    //    Cortex-M processors provide in most cases a SysTick timer that can be used
    //    as a time base for RTX.
    #ifndef OS_SYSTICK
     #define OS_SYSTICK     1
    #endif
    //
    //   <o>RTOS Kernel Timer input clock frequency [Hz] <1-1000000000>
    //    Defines the input frequency of the RTOS Kernel Timer.
    //    When the Cortex-M SysTick timer is used, the input clock
    //    is on most systems identical with the core clock.
    #ifndef OS_CLOCK
     #define OS_CLOCK       12000000
    #endif
    
    //   <o>RTX Timer tick interval value [us] <1-1000000>
    //    The RTX Timer tick interval value is used to calculate timeout values.
    //    When the Cortex-M SysTick timer is enabled, the value also configures the SysTick timer.
    //    Default: 1000  (1ms)
    #ifndef OS_TICK
     #define OS_TICK        1000
    #endif
    

    But I haven't changed anything here before. Is it normal for something to be wrong here?

  • If you started a new project, these values are set to predefined defaults. I seriously doubt that you run your LPC43xx at 12 MHz, as is defined with #define OS_CLOCK 12000000.

    Check your clock settings (system_xxx.c file) and then modify OS_CLOCK value accordingly.
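To see why a wrong OS_CLOCK makes the timer appear to fire too early, consider how RTX derives its SysTick reload value from OS_CLOCK and OS_TICK. The following is only an illustrative sketch (the function names are made up, and the exact RTX reload formula may differ by a cycle), but the arithmetic shows the effect:

```c
#include <stdint.h>
#include <assert.h>

/* RTX loads SysTick with roughly OS_CLOCK * OS_TICK / 1e6 core cycles
   per kernel tick (illustrative approximation of the RTX formula). */
static uint32_t systick_reload(uint32_t os_clock_hz, uint32_t os_tick_us) {
    return (uint32_t)((uint64_t)os_clock_hz * os_tick_us / 1000000u);
}

/* Real duration of one "1 ms" kernel tick when the core actually runs
   at real_clock_hz instead of the configured OS_CLOCK. */
static double real_tick_ms(uint32_t os_clock_hz, uint32_t os_tick_us,
                           uint32_t real_clock_hz) {
    return (double)systick_reload(os_clock_hz, os_tick_us) * 1000.0
           / (double)real_clock_hz;
}
```

With OS_CLOCK = 12 MHz but the core really running at 180 MHz, the reload value is 12000 cycles, so each "millisecond" tick lasts only about 0.067 ms. A timer programmed for 10000 ticks then expires after roughly 667 ms, which matches the "about every second" behaviour described above.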

  • Do you mean this information in system_LPC43xx.c?

    /*----------------------------------------------------------------------------
      This file configures the clocks as follows:
     -----------------------------------------------------------------------------
     Clock Unit  |  Output clock  |  Source clock  |          Note
     -----------------------------------------------------------------------------
       PLL0USB   |    480 MHz     |      XTAL      | External crystal @ 12 MHz
     -----------------------------------------------------------------------------
        PLL1     |    180 MHz     |      XTAL      | External crystal @ 12 MHz
     -----------------------------------------------------------------------------
        CPU      |    180 MHz     |      PLL1      | CPU Clock ==  BASE_M4_CLK
     -----------------------------------------------------------------------------
       IDIV A    |     60 MHz     |      PLL1      | To the USB1 peripheral
     -----------------------------------------------------------------------------
       IDIV B    |     25 MHz     |   ENET_TX_CLK  | ENET_TX_CLK @ 50MHz
     -----------------------------------------------------------------------------
       IDIV C    |     12 MHz     |      IRC       | Internal oscillator @ 12 MHz
     -----------------------------------------------------------------------------
       IDIV D    |     12 MHz     |      IRC       | Internal oscillator @ 12 MHz
     -----------------------------------------------------------------------------
       IDIV E    |    5.3 MHz     |      PLL1      | To the LCD controller
     -----------------------------------------------------------------------------*/
    
    /*----------------------------------------------------------------------------
      Clock source selection definitions (do not change)
     *----------------------------------------------------------------------------*/
    #define CLK_SRC_32KHZ       0x00
    #define CLK_SRC_IRC         0x01
    #define CLK_SRC_ENET_RX     0x02
    #define CLK_SRC_ENET_TX     0x03
    #define CLK_SRC_GP_CLKIN    0x04
    #define CLK_SRC_XTAL        0x06
    #define CLK_SRC_PLL0U       0x07
    #define CLK_SRC_PLL0A       0x08
    #define CLK_SRC_PLL1        0x09
    #define CLK_SRC_IDIVA       0x0C
    #define CLK_SRC_IDIVB       0x0D
    #define CLK_SRC_IDIVC       0x0E
    #define CLK_SRC_IDIVD       0x0F
    #define CLK_SRC_IDIVE       0x10
    
    
    /*----------------------------------------------------------------------------
      Define external input frequency values
     *----------------------------------------------------------------------------*/
    #define CLK_32KHZ            32768UL    /* 32 kHz oscillator frequency        */
    #define CLK_IRC           12000000UL    /* Internal oscillator frequency      */
    #define CLK_ENET_RX       50000000UL    /* Ethernet Rx frequency              */
    #define CLK_ENET_TX       50000000UL    /* Ethernet Tx frequency              */
    #define CLK_GP_CLKIN      12000000UL    /* General purpose clock input freq.  */
    #define CLK_XTAL          12000000UL    /* Crystal oscillator frequency       */
    

  • Okay, so it is right if I set OS_CLOCK to 180 MHz, as in

    CPU      |    180 MHz     |      PLL1      | CPU Clock ==  BASE_M4_CLK
    

    Is this my system clock (CPU = 180 MHz)?

  • Yes, this is also the clock frequency of the SysTick timer ;)

  • Thank you very much, for your help!

    By the way, do you know how I can find out how fast my A/D converter is?

    I want to show my ADC values on the display as a diagram: the y-axis is the ADC value and the x-axis is time. For this, I must know how fast my ADC converts values. Can I change this speed (the conversion rate) of the A/D converter, or is it a fixed value?

    /* Configure ADC0_2 */
      LPC_ADC0->CR = (1 <<  2) |            /* Select ADC0_2 pin for conversion   */
                     (2 <<  8) |            /* 12MHz / (2+1) = 4MHz               */
                     (1 << 21) ;            /* ADC is operational                 */
    

    I'm not sure whether this value (12 MHz / 3 = 4 MHz) is the conversion rate of my A/D converter.

  • The datasheet tends to tell you how many clock cycles the ADC needs for conversions at different precisions, and how slow or fast you may configure the ADC clock. That should give you an indication of how many conversions you can get per second.
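As a sketch of that calculation (the function name is illustrative, not from any library): dividing the ADC clock by the number of clock cycles one conversion needs gives the maximum conversion rate.

```c
#include <stdint.h>
#include <assert.h>

/* Maximum conversions per second: ADC clock divided by the number of
   ADC clock cycles one conversion takes. */
static uint32_t max_samples_per_sec(uint32_t adc_clk_hz,
                                    uint32_t cycles_per_conv) {
    return adc_clk_hz / cycles_per_conv;
}
```

At the 4.5 MHz maximum ADC clock with 11 cycles per 10-bit conversion, this gives about 409 kS/s, which the datasheet rounds down to the quoted 400 kSamples/s; at the 4 MHz configured in the code above it is about 363 kS/s.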

  • I had a look in the datasheet and found the following information:

    - ADC clock frequency is 4.5 MHz
    - Sampling frequency:   --> 10-bit resolution; 11 clock cycles: 400 kSamples/s
                            -->  2-bit resolution;  3 clock cycles: 1.5 MSamples/s
    

    In the user manual I looked up the configuration of the A/D Control register (LPC_ADC0->CR) and found the following information:

    Bit 15:8  CLKDIV
    
    Bit 19:17 CLKS     - 0x0   11 clocks / 10 bits
                       - 0x1   10 clocks /  9 bits
                       - 0x2    9 clocks /  8 bits
                       - ...
                       - 0x7    4 clocks /  3 bits
    

    I have this configuration in my sourcecode now:

    #define ADC_RESOLUTION        10        /* Number of A/D converter bits       */
    
      /* Configure ADC0_2 */
      LPC_ADC0->CR = (1 <<  2) |            /* Select ADC0_2 pin for conversion   */
                     (2 <<  8) |            /* 12MHz / (2+1) = 4MHz               */
                     (1 << 21) ;            /* ADC is operational                 */
    

    My questions:
    I have a 10-bit A/D converter, so I have to set bits 19:17 (CLKS) to 0x0. With this information I know that my sampling frequency is 400 kSamples/s. Is that right?

    In the user manual is this information for CLKDIV:

    The ADC clock is divided by the CLKDIV value plus one to produce the clock
    for the A/D converter, which should be less than or equal to 4.5 MHz. Typically,
    software should program the smallest value in this field that yields a clock of
    4.5 MHz or slightly less, but in certain cases (such as a high-impedance analog
    source) a slower clock may be desirable.
    

    My English is not so good, but what I understand from this text is that my ADC clock frequency should be less than or equal to 4.5 MHz. The default input is 12 MHz and I must divide this value, right? The result of the division should be less than or equal to 4.5 MHz. I can divide it, for example, by 5: then I have an ADC clock frequency of 2.4 MHz (12 MHz / 5 = 2.4 MHz). But I can't divide it by 2, because then I would get an ADC clock frequency of 6 MHz (12 MHz / 2 = 6 MHz), which is not allowed. Have I understood this correctly?
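The reasoning above (divide by at least 3, never by 2) can be computed mechanically; a sketch with an illustrative function name, assuming the 4.5 MHz limit from the user manual:

```c
#include <stdint.h>
#include <assert.h>

/* Smallest CLKDIV field value so that pclk / (CLKDIV + 1) does not
   exceed the maximum allowed ADC clock (4.5 MHz on this part). */
static uint32_t adc_clkdiv(uint32_t pclk_hz, uint32_t max_adc_hz) {
    uint32_t divider = (pclk_hz + max_adc_hz - 1u) / max_adc_hz; /* ceiling */
    return (divider ? divider : 1u) - 1u;  /* register stores divider - 1 */
}
```

For PCLK = 12 MHz this returns 2, i.e. a divider of 3 and an ADC clock of 4 MHz, matching the (2 << 8) in the code quoted earlier.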

    My last question is: what is the difference between the sampling frequency and the ADC clock frequency?
    Which of these is the conversion rate of my A/D converter?

  • To my last question:

    I read that the sampling frequency is how many times my signal is sampled. For example, if I have a 100 Hz signal and a sampling frequency of 1 kHz, my signal is sampled 10 times per period (1 sample/ms).

    Is the ADC clock frequency my conversion rate, i.e. how slowly or quickly the analog signal is converted into a digital value?

  • The difference between clock frequency and sampling frequency?

    The sampling frequency is the number of final results you produce.
    The clock frequency is the clock signal you feed to the ADC.

    With a high clock frequency you either get a higher sampling frequency/rate, or you get a quick conversion and can then let the ADC idle until it's time to start a new conversion. So a higher clock frequency doesn't mean that you must perform more conversions per second.

    Yes - the clock frequency decides how fast the ADC converts the data. So if you know the clock frequency and the number of clock cycles, you can then compute the number of microseconds it will take for one conversion.

    If your sampling frequency is 1 kHz then you will get 1000 samples/second, or 1 sample/ms. So you will sample at 10 times the frequency of your input signal. This doesn't capture a perfect curve shape, but it will give a decent indication of the shape of the input signal - assuming your 100 Hz signal has a base frequency of 100 Hz but also has overtones, i.e. isn't a perfect sine wave. The more samples you get for every period of the input signal, the better you can capture the amount of overtones and how the overtones affect the signal shape.

    For a digital oscilloscope, you would normally want a sampling frequency >= 20 times the frequency of the signals you want to capture.

    But once more - the clock frequency to your ADC must be much higher than the sampling frequency of the ADC. And the user manual for the processor specifies the minimum frequency needed for a specific sampling rate with a specific number of converted bits.
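The relationship described above can be put into numbers; a small sketch with illustrative helper names:

```c
#include <stdint.h>
#include <assert.h>

/* Duration of a single conversion, in nanoseconds. */
static uint32_t conversion_time_ns(uint32_t adc_clk_hz,
                                   uint32_t cycles_per_conv) {
    return (uint32_t)((uint64_t)cycles_per_conv * 1000000000u / adc_clk_hz);
}

/* Interval between consecutive samples, in nanoseconds. */
static uint32_t sample_period_ns(uint32_t sample_rate_hz) {
    return 1000000000u / sample_rate_hz;
}
```

With a 4 MHz ADC clock and 11 cycles, one 10-bit conversion takes 2750 ns (2.75 µs), while sampling at 300 samples/s means one conversion every 3.33 ms; the ADC is therefore idle almost all of the time, which is exactly the "convert quickly, then idle" case described above.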

  • Thank you very much for your answer! Now, I can better understand the ADC.

    I have a circuit with a Bessel filter and a cutoff frequency of 15 Hz.

    So for a signal of at most 15 Hz, I need a minimum of 300 samples/s (15 Hz * 20 = 300 samples/s).

    My ADC has a 10-bit resolution and therefore needs 11 clock cycles. This means that my ADC clock frequency is 3 kHz (300 samples/s * 10 = 3000 Hz = 3 kHz).

    If I want to configure the ADC clock frequency, I must divide 12MHz by 4000 to get a 3kHz ADC clock frequency (12MHz / 4000 = 3kHz). Is this right?

  • Note that for 300 samples/second, you need a _minimum_ of 300*11 = 3.3kHz. But you can also use 50kHz or 1MHz clock frequency to the ADC. Your allowed range is [3.3kHz .. 4.5MHz]

    And it's normally not recommended to run the ADC clock at too low a frequency, because the ADC internally has a sample-and-hold circuit, and a very low clock frequency means a very slow conversion, which means the capacitor in the sample-and-hold will have time to slowly discharge before the conversion ends.

    So it's normally better to run the ADC at a much higher clock frequency and then have a timer ISR start a one-shot conversion every 3.33ms to get your 300 samples/second.

  • Okay, now I understand. Thank you very much!

  • ...
    ADC_StartConversion();
    while(ADC_ConversionDone() < 0);
    adcValue = ADC_GetValue();
    ...
    

    I got adcValue = 1023. This is the converted value from my ADC. But is this one point on my signal?

    For example, if my ADC has a sampling frequency/rate of 300 samples/s, my timer period is 1 second, and my signal is a sine wave, is the value 1023 one sample/point on my analog signal? What does the value 1023 mean? I know it's a converted digital value, but I can't picture it on my signal.

    Because I get one value every second (1023, 1015, 1025, ...), yet my sampling frequency is 300 samples/s. How should I understand this?
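Following the earlier replies, each conversion result is indeed one point on the sampled waveform. A sketch of the mapping (the function names and Vref are illustrative assumptions, not from any API): sample n, taken at fs samples/s, belongs to time n / fs, and a 10-bit code maps back to a fraction of the reference voltage.

```c
#include <assert.h>

/* Sample n of a signal sampled at fs samples/s corresponds to
   time n / fs on the x-axis. */
static double sample_time_s(unsigned n, double fs_hz) {
    return (double)n / fs_hz;
}

/* A 10-bit ADC code maps back to a voltage of code * Vref / 1023,
   so 1023 is full scale (the input is at or above Vref). */
static double code_to_volts(unsigned code, double vref) {
    return (double)code * vref / 1023.0;
}
```

Reading one value per second, as in the polled snippet above, gives one point per second even though the ADC could deliver 300 per second; to plot the 15 Hz curve, every sample at the 300 samples/s rate would need to be stored in a buffer and drawn against its time n / fs.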