I have some questions about the correct use of the CMSIS DSP library function arm_fir_f32. First, I'll provide some background about what I am doing and what the setup is.
I have an STM32F4 Discovery board, programmed with IAR EWARM. Just for testing purposes, I'm generating a low-frequency test signal at 40 Hz and feeding it into one of the ADC inputs. The signal is biased to swing from 0 V to about 2.5 V peak-to-peak. The signal has a low to moderate amount of broadband noise, but at this point I am not purposely mixing or introducing any other signals with it. A timer interrupt is set to a sample frequency of 2 kHz, with a sampling buffer of 2048 samples.
I have already tested and am using the FFT function arm_cfft_f32, and can accurately determine (track) the frequency of the incoming signal when I change it at the source. This seems to be working well.
Now, I would like to use the arm_fir_f32 filter. To do this, I started by reading the CMSIS documentation for the function. To implement a low-pass FIR and generate the tap coefficients, I am using an online filter-design tool.
I generated a 4th-order filter, set the sampling rate to match my software, with a cutoff of 60 Hz, and forced generation to 24 taps so the count is even. So BLOCK_SIZE is 32, and the number of blocks is 1024/32 = 32.
Following the CMSIS example for this function, I believe I've set it up correctly. So the chain looks like this:
ADC --> FIR --> FFT
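For reference, the block-processing pattern I'm following is the one from the CMSIS docs: a state buffer of numTaps + blockSize - 1 floats, with the filter called once per block. Below is a plain-C stand-in for that call, not the CMSIS implementation itself, just the same direct-form convolution and buffer layout, so the bookkeeping can be checked in isolation:

```c
#include <string.h>

#define NUM_TAPS   24
#define BLOCK_SIZE 32

/* Plain-C stand-in for arm_fir_f32 (direct-form FIR over one block).
   state[] must hold NUM_TAPS + BLOCK_SIZE - 1 floats, zeroed before
   the first call, the same layout the CMSIS docs require. */
static void fir_block(const float *coeffs, float *state,
                      const float *in, float *out,
                      int numTaps, int blockSize)
{
    /* Keep the newest numTaps-1 history samples, then append the new block. */
    memmove(state, state + blockSize, (numTaps - 1) * sizeof(float));
    memcpy(state + numTaps - 1, in, blockSize * sizeof(float));

    for (int n = 0; n < blockSize; n++) {
        float acc = 0.0f;
        for (int k = 0; k < numTaps; k++)
            acc += coeffs[k] * state[numTaps - 1 + n - k];
        out[n] = acc;
    }
}
```

With unity-DC-gain coefficients (say, a moving average with every tap 1/24), a constant input comes out unchanged once the history fills. If the real output blows up instead, the usual suspects are an undersized or uninitialized state buffer, or the coefficient ordering (the CMSIS docs store coefficients in time-reversed order, though for a symmetric low-pass that happens not to matter).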
However, I'm not getting the result I would expect. The values returned in the FFT's output buffer are enormously large (they are not when I comment out or bypass the FIR calls). This leads me to believe I am missing a step. Do I need to normalize the values? I thought that because I pass the rate into the FIR setup, this wouldn't be required, but maybe that's incorrect.
Can someone please provide some insight or assistance as to what I am missing or doing incorrectly to apply the FIR processing?
Still something bizarre going on with this - sorry to open this question again.
I've got the timer interrupt generating at these intervals:
TIM_TimeBaseStructure.TIM_Prescaler = 50;
TIM_TimeBaseStructure.TIM_Period = 1605; // timer interrupts at 1 ms = 1 kHz
use the above, OR use:
TIM_TimeBaseStructure.TIM_Period = 800; // timer interrupts at close to 500 us = 2 kHz
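For what it's worth, the interrupt rate those register values actually produce can be computed from the timer input clock. A small sketch, assuming the default STM32F4 clock tree (84 MHz timer clock on an STM32F407 running at 168 MHz SYSCLK; adjust if your clock configuration differs) and the hardware's +1 on both registers:

```c
/* STM32 timers fire an update event every (PSC+1)*(ARR+1) timer ticks.
   84 MHz is an assumption based on the usual F407 clock tree; check
   your own clock configuration. */
#define TIM_CLK_HZ 84000000.0

static double timer_update_hz(unsigned prescaler, unsigned period)
{
    return TIM_CLK_HZ / ((prescaler + 1.0) * (period + 1.0));
}
```

Plugging in Prescaler = 50, Period = 1605 gives roughly 1026 Hz rather than exactly 1 kHz, so it may be worth confirming that the intended sample rate is the one actually being generated; an off sample rate shifts every bin-to-Hz conversion downstream.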
The main loop runs until loopCtr == BUFFER_SIZE then runs the FFT.
Below, using these defines gives me bins with 0.5 Hz resolution (a processed frequency of 25 Hz has its highest bin at #50):
#define SAMPLE_FREQ 1000
#define BUFFER_SIZE 2048
#define FFT_SIZE 1024
So why do these defines give me just 1 Hz resolution (a processed frequency of 25 Hz has its highest bin at #25)?
#define SAMPLE_FREQ 2000
#define BUFFER_SIZE 2048
The only thing that changes between those two sets is SAMPLE_FREQ. Also, I am at a loss to understand why the first set provides 0.5 Hz resolution at all.
Because according to DSP theory, Fs / FFTsize = bin resolution.
What am I missing here?
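To make the comparison concrete, here is the arithmetic I'd expect from Fs / FFTsize, written out as a check (plain C, nothing CMSIS-specific):

```c
/* Bin resolution and expected peak bin for a pure tone, straight from
   resolution = Fs / FFT_size and bin = f / resolution. */
static double bin_resolution_hz(double sample_freq, double fft_size)
{
    return sample_freq / fft_size;
}

static double expected_bin(double tone_hz, double sample_freq, double fft_size)
{
    return tone_hz / bin_resolution_hz(sample_freq, fft_size);
}
```

For Fs = 1000 and FFT_SIZE = 1024 that's 0.9765625 Hz per bin, putting a 25 Hz tone around bin 25.6 (not bin 50); for Fs = 2000 and FFT_SIZE = 1024 it's 1.953125 Hz per bin, around bin 12.8 (not bin 25). So in both of my measurements the peak lands at roughly double the predicted bin, which is the discrepancy I'm trying to pin down.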
No, I don't expect you to debug the program. I'm trying to figure out where the issue is, nothing more.
I am sending you the source files for the project directory in a RAR file. I'm confident that when you look at the code you'll see there is no other code, it's as simple as I have described, and hopefully you can tell me why the CMSIS call with the defines in question is not returning the expected values.
I agree it should be Fs / FFTsize = bin resolution.
So in your first case
resolution should be 0.9765625Hz, with the 25Hz signal showing up mostly in bins 25 and 26
and the second case
resolution should be 1.953125Hz, with the 25Hz signal showing up mostly in bins 12 and 13.
I'm not sure why you're getting roughly double that. I have an inkling this might have more to do with your other code than with CMSIS. I feel it might be appropriate at this point to let you know that I'm here to support you with CMSIS-related questions, not to debug any bug you come across in your own code. If you're confident that CMSIS is misbehaving in some way, or you're unsure how to correctly use a CMSIS function, please do let me know and I will work through it with you to resolve the issue. As always, I need as much information as possible to debug it, preferably a working example program that involves only CMSIS and nothing else.
edited which bins it should show up in because I fail at basic math
(Hope I deleted that last post before you read it. Thought I might have found problem...but not so)
An update on what I've also tried. I'm now setting the timer interrupt this way: using the prescaler for the main divide and leaving the period at 1000. It works as long as the product is close to 82500 (my Discovery board is off a bit from the 84000 it should be).
I've found that these settings provide 0.5 Hz resolution, although they should not (the 24 Hz signal's bin is at 48). It seems to be related to the buffer size. The output updates every 2 seconds, which is what would be expected given the sample frequency and buffer size:
#define SAMPLE_FREQ 1000
These settings behave quite differently: the 24 Hz signal's bin is at 24, the resolution is 1 Hz, and the update rate is 1 second.
#define BUFFER_SIZE 1024
#define FFT_SIZE 512
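One sanity check on the sample clock itself: the buffer fill time should simply be BUFFER_SIZE / SAMPLE_FREQ, and that matches what I observe (2 s for 2048 samples at 1 kHz, 1 s for 1024). A trivial sketch:

```c
/* Seconds to fill the capture buffer. This matching the observed
   update rate suggests the timer/ADC side really is sampling at the
   intended rate. */
static double buffer_fill_seconds(double buffer_size, double sample_freq)
{
    return buffer_size / sample_freq;
}
```

Since the update intervals come out exactly as predicted, the sample rate looks correct, which seems to narrow the factor-of-two oddity to how the samples get from the buffer into the FFT rather than to the timer side.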
Hope this additional info helps discover why the FFT is behaving this way. Thanks again for looking into this.
NB: On my Discovery board, I installed the LSE crystal and caps to get sub-second timestamps so I could measure the processing intervals. That's the only reason the RTC code is there at this point.
Forget it then...
Clearly, there's an issue with the FFT at specific rates, buffer sizes, and FFT sizes. It would have been nice if this had been debugged before release. I'll be sure to post here and let you know how SIGLIB works out.
Thanks for your help.