
STM32F4 Input Capture - Timer resolution changes with optimization

Hi all,

I've recently set up an input capture channel on an STM32F4 processor. I'm using Keil uVision 5.38a with version 6.19 of the ARMClang compiler. I'm using TIM2 for the input capture, with slave mode set to reset, so I'm measuring the interval between successive triggers. The counter prescaler is 0 (confirmed in the debugger) in all cases.
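For context, the setup is roughly along these lines. This is a simplified register-level sketch using the CMSIS device header names, not my exact init code, and the GPIO alternate-function setup is omitted:

```c
#include "stm32f4xx.h"

/* Sketch of the TIM2 capture configuration:
 *  - CH2 as input capture mapped to TI2 (value lands in CCR2)
 *  - slave mode = reset, trigger = TI2FP2, so CNT restarts on every edge
 *  - all prescalers 0, so CCR2 holds the period between edges in timer ticks
 */
static void tim2_capture_init(void)
{
    RCC->APB1ENR |= RCC_APB1ENR_TIM2EN;            /* enable timer clock        */

    TIM2->PSC   = 0;                               /* no counter prescaling     */
    TIM2->ARR   = 0xFFFFFFFF;                      /* free-running 32-bit count */

    TIM2->CCMR1 = TIM_CCMR1_CC2S_0;                /* CC2 = input, mapped to TI2 */
    TIM2->CCER  = TIM_CCER_CC2E;                   /* capture on rising edge     */

    TIM2->SMCR  = TIM_SMCR_TS_2 | TIM_SMCR_TS_1    /* trigger source = TI2FP2    */
                | TIM_SMCR_SMS_2;                  /* slave mode = reset         */

    TIM2->DIER |= TIM_DIER_CC2IE;                  /* interrupt on each capture  */
    TIM2->CR1  |= TIM_CR1_CEN;                     /* start the counter          */
    NVIC_EnableIRQ(TIM2_IRQn);
}
```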

I noticed that I was missing 4 bits of resolution in the timer. The CCR2 register, as viewed in the IDE, showed hex 'E' as its least-significant nibble, and that nibble never changed even though the higher bits would jitter. While debugging, I changed the optimization settings and the problem went away: I got the full resolution of the counter, and all values of the CCR2 lower nibble appeared. I originally had the project set to -O3 with link-time optimization. When I turned off link-time optimization, I would only see even numbers, so I was still losing 1 bit of precision. And when I went from -O3 to -O2, the problem went away completely.
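To show where the value is read, the capture ISR is essentially just this (again a simplified sketch, not my exact code):

```c
/* The period lands in CCR2 because the reset trigger restarts CNT on every
 * capture edge, so each capture is the tick count since the previous edge. */
volatile uint32_t last_period;

void TIM2_IRQHandler(void)
{
    if (TIM2->SR & TIM_SR_CC2IF)
    {
        last_period = TIM2->CCR2;   /* reading CCR2 also clears CC2IF */
        /* ... last_period is then used by the measurement code ... */
    }
}
```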

I didn't think the compiler could affect the behaviour of the peripheral once it was configured. Can anyone offer any insight into what's going on? I'm concerned there may be other things I haven't discovered yet that are being compromised by the optimization settings.

Thanks,

Andy