This discussion has been locked.

How to increase delay time in RTOS program?

Dear Sir/Madam
I configured my RTOS tick at 25 µs in order to read sensors fast, but on the other hand I need, for example, a 50-second delay in my program. So I use

void os_tmr_call(U16 ifo)

and

os_tmr_create(U16)

A U16 holds 65536 counts, and as you know the maximum delay is then 65536 × 25 µs ≈ 1.64 s. If I could pass a U32 to os_tmr_create the problem would be solved, but I can't.
Also, I don't want to block the program on a single line; that is, I don't want to use, for example,

for (i = 0; i < 50; i++) os_dly_wait(40000);

and just want to use a virtual timer or something like that.

Could you please tell me how I can increase the delay time to 50 s or more, based on a 25 µs tick?

  • You do realize that a short time quantum for the OS is not the only way to handle fast sampling of a sensor?

    I'd say that having a really short time quantum is actually quite a bad idea.

    A much better route is to program a timer to generate a timer interrupt at your required frequency. Then in the ISR you start a sensor conversion. Let the sensor generate an interrupt when the data is available and pick up the result. Or pick up the result on the next timer interrupt just before you start the next conversion.

    Each time you do get an answer, you can place the data in a queue and have a thread pick it up. If you want it to be picked up quickly, you can trigger an event. Or you can use a mailbox and send the data. Or, if there is no need to react instantly, you can have a thread poll the queue n times/second and see how many samples are available for processing.

    Another thing to realize - sometimes a program might need delays that are longer than what a timer can handle. The trivial way to solve this is a software counter. So if the timer can measure up to 1 second, and you need a 30-second delay, you can wake up 30 times, updating the software counter each time; when it has reached 30 (or zero, if you decrement), your 30 seconds have passed.

    So don't expect to be able to create all your delays as a single delay. Many embedded applications make use of a 1 kHz timer interrupt. Each interrupt represents 1 ms. The ISR can increment a 1 ms counter each time; when that counter reaches 1000, 1 second has passed. So the ISR can reset the 1 ms counter and instead increment a 1-second counter. And with a 32-bit integer for the 1-second counter, that 1 kHz interrupt can handle delays of over 100 years. Not enough? Then count 24*3600 seconds and increment a 1-day counter. Then your program can use a 32-bit integer to keep track of a delay of over four billion days...

    Don't spend your time getting stuck. Spend your time looking for opportunities. Be creative. Experiment. Before we got our 8-bit microprocessors, people managed to go to the moon. So a 16-bit limitation for the OS delay function can hardly be seen as a big problem to overcome.
