
RTX on Cortex M3 applications

Hello,

I am evaluating RTX on various platforms. The first evaluation kit I tried was the AT91SAM9263; using the supplied examples, it went well. Since no RTX example is available for my Cortex-M3 device (LM3S1968), I used the common Cortex-M configuration file (RTX_Config.h). However, execution does not proceed past os_sys_init (task1). Even after setting up the SysTick timer (SysTickPeriodSet() and SysTickEnable()), os_sys_init ends up in the IntDefaultHandler branch-to-self loop.

Please advise.

  • Hello yanki doodle,

    There are a lot of RTX examples for Cortex-M3 devices.

    If you are looking for an RTX example for a Luminary Cortex-M3 device, please check .\Keil\ARM\Boards\Luminary\ek-lm3s6965\RTX_Blinky.
    I think this example can easily be ported to the LM3S1968.

    Best Regards,
    Martin Guenther

  • Thanks Martin... I referred to the example; SVC_Handler was not set, and that was the problem. I no longer end up in IntDefaultHandler; however, there is a problem with timing. If I continue in debug mode by stepping or using breakpoints, the tasks switch as expected (of course not in real time). In debug mode, if I run the code from the beginning, it visits the ticker task only once and then runs into the os_idle_demon task.

    The ticker task is as follows:

    __task void ticker(void)
    {
        while (1) {
            os_evt_set(0x1, t_phaseA);
            os_dly_wait(50);
            //os_evt_set(0x1, t_phaseB);
            //os_dly_wait(50);
        }
    }

  • Just to add that the code is running from flash and the display is updated, meaning the tasks are running; however, the debugger (ULINK2) does not stop on the breakpoint after the expected delay.

    result = os_evt_wait_or(0x1, 10000);

    if (result != OS_R_TMO)
    {
        <<breakpoint>> RIT128x96x4StringDraw(&pucHello[ulCol++], 0, 0, 11);
        ....
        ....
        ....

    Is there any setting I am missing that is required for RTX debugging?

  • Setting the optimization level to zero does the trick; the breakpoints are now hit.