Why does osMessageQueueGet in a thread result in host buffer overflows?

I have multiple threads that try to read from a message queue via

osMessageQueueGet(MsgQueue, &msg, NULL, 0U)

The Event Recorder shows me very frequent "Host Buffer Overflow" events. According to https://www.keil.com/support/man/docs/uv4/uv4_db_dbg_evr_view.htm this happens when the buffer is written too frequently, but it also happens when the queue is empty and nothing is actually written to the message queue. When I set the timeout of osMessageQueueGet() to 1 ms or more, the overflows become less frequent, but that also makes the algorithm less responsive.

Does someone have an idea why this happens? The code is running on a Cortex-M3. Is it too slow?

I followed the documentation and implemented it exactly as described there. The event sequence in the recorder still doesn't look right.

Reply
  • Your code is spinning, which is causing too many messages for the debug link to handle. Find a way to slow down the calls to osMessageQueueGet(). For example, wait on an event to check the queue. Or, modify the event recorder message filter so that only the messages you want make it to the debugger. 
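A minimal sketch of the blocking approach suggested above, assuming the poster's CMSIS-RTOS2 queue handle `MsgQueue`; the message type `msg_t` and the thread function name are illustrative, not from the original post:

```c
#include "cmsis_os2.h"
#include <stdint.h>

typedef struct { uint32_t id; uint32_t data; } msg_t;  /* hypothetical message type */

extern osMessageQueueId_t MsgQueue;  /* created elsewhere with osMessageQueueNew() */

/* Consumer thread: instead of polling with a zero timeout, block until a
 * message actually arrives. The thread is suspended inside
 * osMessageQueueGet() while the queue is empty, so it generates no
 * Event Recorder traffic during that time. */
void ConsumerThread(void *argument) {
  msg_t msg;
  (void)argument;
  for (;;) {
    /* osWaitForever: suspend this thread until a message is available */
    if (osMessageQueueGet(MsgQueue, &msg, NULL, osWaitForever) == osOK) {
      /* process msg here */
    }
  }
}
```

Because the thread sleeps while the queue is empty, osMessageQueueGet() is entered once per message rather than once per scheduler pass, so the Event Recorder logs only real queue activity and the debug link is no longer flooded. If the thread cannot block indefinitely, filtering the Event Recorder output (as mentioned above) is the alternative.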
