> I think one of the reasons this method works so well is that the FFT
> effectively averages the signal over some time, and I use a tool in
> the software to derive an average across all the FFT results. That
> smooths out the instantaneous variations that make real-time
> measurement such a challenge.
What size FFT are you using?  What sort of averaging are you doing?
What is the bandwidth of the signal you are looking at?  How does that
compare to the bin size of your FFT?

If you are recording the raw data and then post-processing the signal,
I'd expect you could FFT the whole thing.  It has to fit in memory, but
that doesn't look like a problem.  I think that would get the bin size
down to 1/N Hz if you had N seconds of data.  (But I'm not a DSP
wizard.)

If you do that, there is only one FFT result, so there is nothing to
average in the time dimension.  If the signal is wide enough to end up
in several bins, you could average in the frequency dimension.

-- 
The suespammers.org mail server is located in California.  So are all
my other mailboxes.  Please do not send unsolicited bulk e-mail or
unsolicited commercial e-mail to my suespammers.org address or any of
my other addresses.  These are my opinions, not necessarily my
employer's.  I hate spam.

_______________________________________________
time-nuts mailing list
[email protected]
https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
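P.S.  A quick sketch of the bin-size arithmetic, in case it helps.  The
sample rate, tone frequency, and record length below are made up for
illustration; the point is just that one FFT over the whole record
gives a bin spacing of 1/T Hz for T seconds of data, and a wide signal
can then be averaged across adjacent bins instead of across time.

```python
# Illustrative only: one FFT over the whole record, then averaging in
# the frequency dimension.  All signal parameters here are hypothetical.
import numpy as np

fs = 1000.0            # sample rate, Hz (assumed)
T = 10.0               # record length, seconds (assumed)
n = int(fs * T)        # number of samples
t = np.arange(n) / fs

# A 100 Hz tone plus a little noise stands in for the measured signal.
x = np.sin(2 * np.pi * 100.0 * t) + 0.1 * np.random.randn(n)

# FFT the entire record at once: bin spacing is fs/n = 1/T Hz.
spectrum = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
bin_width = freqs[1] - freqs[0]
print(bin_width)       # 1/T = 0.1 Hz here

# If the signal spreads over several bins, average its power across
# those bins (frequency-dimension averaging) -- there is only one
# spectrum, so there is nothing to average in time.
peak = int(np.argmax(spectrum))
band_power = spectrum[peak - 2 : peak + 3].mean()
print(freqs[peak])     # peak lands at the tone frequency, ~100 Hz
```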
