
CMSIS CAN Driver SetBitrate() call

Hi,

I am reading the CMSIS documentation for ARM_CAN_SetBitrate() and have the following question concerning its parameters...

The "bitrate" parameter according to the documentation defines the bitrate.

The "bit_segments" parameter sets the number of Tq making up a bit time... and how the bit is divided up into segments.

Never mind, I think I answered my own question just by formulating it. However, if someone could confirm the following, that would be good... since it still seems backwards to me.

--- The time quantum (Tq) is not specified directly but is obtained by dividing the total bit time by the number of Tq making up the segments, plus 1 for the sync segment. ---

(Very backwards... as Tq is provided by the hardware, and it would make sense (to me) to start the calculation from there.)

Thanks,
Tony

  • Well, it really all comes down to specifying bit time and sample point.

    This can be done in multiple ways, and this is the way CMSIS CAN driver specifies it through bitrate and segments.

    Regarding "Tq is provided by the hardware": it actually depends on the clock and divider settings. To make things simpler for the user, the idea is that the driver sets the divider to produce the required sample point, so Tq is calculated by the driver; for the user, Tq is irrelevant.
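    A sketch of that driver-side calculation, with hypothetical numbers (an 8 MHz CAN peripheral clock is assumed, not taken from any particular device): given the clock, the requested bitrate, and the segment counts, the driver can derive the prescaler, and hence Tq, itself.

    ```c
    #include <assert.h>
    #include <stdint.h>

    int main(void)
    {
        const uint32_t can_clock  = 8000000U; /* Hz, hypothetical peripheral clock */
        const uint32_t bitrate    = 500000U;  /* bit/s, requested by the user */
        const uint32_t tq_per_bit = 8U;       /* 1 sync + prop + phase1 + phase2 */

        /* clocks per bit = 8 MHz / 500 kbit/s = 16;
         * prescaler = 16 / 8 = 2, so one Tq = 2 clocks = 250 ns. */
        const uint32_t prescaler = can_clock / (bitrate * tq_per_bit);

        assert(prescaler == 2U);
        return 0;
    }
    ```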

    The GetClock function is provided so the user can see the maximum number of clocks available per bit for the segment setting, and can choose a number of segments that divides it without remainder, so that the bit time is exact.
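    To illustrate that check (again with a hypothetical 8 MHz clock, standing in for whatever value GetClock reports on a real device): verify that the chosen Tq count divides the clocks-per-bit exactly, then compute the resulting sample point.

    ```c
    #include <assert.h>
    #include <stdint.h>

    int main(void)
    {
        const uint32_t can_clock      = 8000000U; /* Hz, hypothetical */
        const uint32_t bitrate        = 500000U;  /* bit/s */
        const uint32_t clocks_per_bit = can_clock / bitrate; /* 16 */

        /* 8 Tq per bit divides 16 exactly, so the bit time is exact. */
        const uint32_t tq_per_bit = 8U;
        assert(clocks_per_bit % tq_per_bit == 0U);

        /* Sample point = (sync + prop + phase1) / total = (1+5+1)/8 = 87.5 %. */
        const uint32_t prop = 5U, phase1 = 1U, phase2 = 1U;
        const uint32_t sample_point_permille =
            (1U + prop + phase1) * 1000U / (1U + prop + phase1 + phase2);

        assert(sample_point_permille == 875U);
        return 0;
    }
    ```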