Can anyone suggest how to implement a microsecond delay in CMSIS-RTOS?
Here is my code:
#include "cmsis_os.h"

void start_alive_led(void const *argument)
{
  /* USER CODE BEGIN start_alive_led */
  /* Infinite loop */
  for (;;)
  {
    HAL_GPIO_WritePin(alive_GPIO_Port, alive_Pin, GPIO_PIN_SET);
    osDelay(500);
    HAL_GPIO_WritePin(alive_GPIO_Port, alive_Pin, GPIO_PIN_RESET);
    osDelay(500);
  }
  /* USER CODE END start_alive_led */
}
The task/thread function works fine with a 500 ms delay.
How can I achieve a delay in microseconds (µs)?
I need to implement an IR blaster, so I need a µs delay for each 1 and 0.
Please help, thanks.
Don't use interrupt-based delays; you almost certainly can't sustain a 1 MHz rate. Use a free-running timer and delta the values. If your part supports DWT_CYCCNT, use that to count CPU cycles.
Thanks for your kind reply.
I'm learning RTOS bit by bit.
Could you please give me sample code using DWT_CYCCNT, from initialization through producing the delay?
I'm using an STM32F103 microcontroller.
Thank you for your help.
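A minimal sketch of the DWT_CYCCNT approach for a Cortex-M3 like the STM32F103. The register addresses below are the standard Cortex-M3 debug addresses; in a real project you would use the CMSIS symbols CoreDebug->DEMCR, DWT->CTRL, and DWT->CYCCNT from core_cm3.h instead of raw defines. The 72 MHz core clock is an assumption (the F103's maximum); substitute your actual SystemCoreClock. The helper names (dwt_init, delay_us, us_to_cycles) are illustrative, not from any library.

```c
#include <stdint.h>

/* Cortex-M3 cycle counter (DWT) and debug control (DEMCR) registers.
   In a CMSIS project, use CoreDebug->DEMCR, DWT->CTRL, DWT->CYCCNT. */
#define DEMCR           (*(volatile uint32_t *)0xE000EDFCu)
#define DWT_CTRL        (*(volatile uint32_t *)0xE0001000u)
#define DWT_CYCCNT      (*(volatile uint32_t *)0xE0001004u)
#define DEMCR_TRCENA    (1u << 24)   /* enables the DWT/ITM blocks   */
#define DWT_CTRL_CYCEN  (1u << 0)    /* enables the cycle counter    */

/* Assumed core clock: STM32F103 at full speed (adjust to yours). */
#define CPU_HZ  72000000u

/* Convert a microsecond count to CPU cycles (pure arithmetic). */
static inline uint32_t us_to_cycles(uint32_t us, uint32_t cpu_hz)
{
    return us * (cpu_hz / 1000000u);
}

/* Call once at startup, before the first delay. */
void dwt_init(void)
{
    DEMCR     |= DEMCR_TRCENA;     /* power up the DWT block       */
    DWT_CYCCNT = 0;                /* reset the counter            */
    DWT_CTRL  |= DWT_CTRL_CYCEN;   /* start counting CPU cycles    */
}

/* Busy-wait for the given number of microseconds.  We delta the
   free-running counter, so CYCCNT wrap-around is handled correctly
   by the unsigned subtraction. */
void delay_us(uint32_t us)
{
    uint32_t start  = DWT_CYCCNT;
    uint32_t cycles = us_to_cycles(us, CPU_HZ);
    while ((DWT_CYCCNT - start) < cycles)
        ;   /* spin */
}
```

Usage would be, for example, dwt_init() once in main(), then delay_us(13) for roughly the half-period of a 38 kHz IR carrier. Note this is a busy-wait, so for timing-critical IR bursts you would normally run it in a high-priority task or with the scheduler/interrupts masked around the burst.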