I'm definitely not the most mathy person on the list, but I think there's
something about the complex exponentials, real transforms and the 2-point
case. For all real DFTs you should get a real-valued sample at DC and
Nyquist, which indeed you do get with your matrix. However, there should be
some
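A quick pure-Python sanity check of the real-input property mentioned above (the naive O(N^2) `dft` helper and the test signal are just illustrative, not anyone's actual code): for a real-valued input, the DC bin and the Nyquist bin of the DFT come out real-valued.

```python
import cmath

def dft(x):
    # Naive O(N^2) DFT: X[k] = sum_n x[n] * exp(-2j*pi*k*n/N)
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

x = [0.7, -1.2, 3.4, 0.1, -0.5, 2.2, 1.0, -0.9]  # arbitrary real signal
X = dft(x)
N = len(x)
# For real input, the DC (k=0) and Nyquist (k=N/2) bins are real:
# their imaginary parts vanish up to floating-point rounding.
print(abs(X[0].imag), abs(X[N // 2].imag))
```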
You can combine consecutive DFTs. Intuitively, the basis functions are
periodic on the transform length. But it won't be as efficient as having
done the big FFT (as you say, the decimation in time approach interleaves
the inputs, so you gotta pay the piper to unwind that). Note that this is
for
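To make the "periodic on the transform length" intuition concrete, here's a hedged sketch (naive `dft` helper and signal are illustrative): summing the DFTs of two consecutive length-N blocks reproduces exactly the even-indexed bins of the length-2N DFT, because the length-N basis functions repeat across both blocks. The odd bins are what you can't get without undoing the structure.

```python
import cmath

def dft(x):
    # Naive O(N^2) DFT
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

x = [0.3, 1.7, -0.8, 2.1, 0.0, -1.1, 0.9, 0.4]  # length 2N = 8
N = len(x) // 2
A = dft(x[:N])   # DFT of first block
B = dft(x[N:])   # DFT of second (consecutive) block
X = dft(x)       # full-length DFT
# A[m] + B[m] == X[2m]: the block DFTs sum to the even bins of the big DFT,
# since exp(-2j*pi*m*(n+N)/N) == exp(-2j*pi*m*n/N).
for m in range(N):
    assert abs((A[m] + B[m]) - X[2 * m]) < 1e-9
print("even bins match")
```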
Well, I dunno shit about the history also. I just ascribed all of the radix-2
FFT to Cooley and Tukey. But I think you're mistaken about the technical claim.
If you have or can get Oppenheim and Schafer and go to the FFT chapter of
whatever revision you have, and there are several
I don't think that's correct -- DIF involves first doing a single stage of
butterfly operations over the input, and then doing two smaller DFTs on
that preprocessed data. I don't think there is any reasonable way to take
two "consecutive" DFTs of the raw input data and combine them into a longer
DFT.
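The DIF structure described above can be sketched in a few lines (a pure-Python illustration, with a naive `dft` standing in for the two smaller DFTs): the single first stage of butterflies forms sums and twiddle-weighted differences of x[n] and x[n+N], and the two half-size DFTs of that preprocessed data yield the even and odd output bins.

```python
import cmath

def dft(x):
    # Naive O(N^2) DFT
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

x = [1.0, -0.4, 2.3, 0.6, -1.5, 0.2, 0.8, -2.0]  # length 2N = 8
N = len(x) // 2
# DIF first stage: one pass of butterflies over the raw input
s = [x[n] + x[n + N] for n in range(N)]                  # feeds the even bins
d = [(x[n] - x[n + N]) * cmath.exp(-2j * cmath.pi * n / (2 * N))
     for n in range(N)]                                  # feeds the odd bins
X = dft(x)
E, O = dft(s), dft(d)
for k in range(N):
    assert abs(E[k] - X[2 * k]) < 1e-9        # even bins
    assert abs(O[k] - X[2 * k + 1]) < 1e-9    # odd bins
print("DIF decomposition matches")
```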
Ethan, that's just the difference between Decimation-in-Frequency FFT and
Decimation-in-Time FFT.
i guess i am not entirely certain of the history, but i credited both the DIT
and DIF FFT to Cooley and Tukey. that might be an incorrect historical
impression.
On 05.11.2018 at 16:17, Ethan Fenn wrote:
Of course it's possible you'll be able to come up with a clever
frequency estimator using this information. I'm just saying it won't
be exact in the way Cooley-Tukey is.
Maybe, but not the way I laid it out.
Also it seems wiser to interpolate
It's not exactly Cooley-Tukey. In Cooley-Tukey you take two _interleaved_
DFT's (that is, the DFT of the even-numbered samples and the DFT of the
odd-numbered samples) and combine them into one longer DFT. But here you're
talking about taking two _consecutive_ DFT's. I don't think there's any
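The interleaved combination being described can be checked directly (again a naive-`dft` sketch, not an efficient FFT): the DFTs of the even-numbered and odd-numbered samples recombine with twiddle factors into the full-length DFT.

```python
import cmath

def dft(x):
    # Naive O(N^2) DFT
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

x = [0.5, 2.0, -1.3, 0.7, 1.1, -0.2, 3.0, -0.6]  # length 2N = 8
N = len(x) // 2
E = dft(x[0::2])   # DFT of the even-numbered samples
O = dft(x[1::2])   # DFT of the odd-numbered samples
X = dft(x)
# Cooley-Tukey (DIT) recombination: X[k] = E[k mod N] + W^k * O[k mod N],
# where W = exp(-2j*pi/(2N)); E and O are N-periodic.
for k in range(2 * N):
    W = cmath.exp(-2j * cmath.pi * k / (2 * N))
    assert abs((E[k % N] + W * O[k % N]) - X[k]) < 1e-9
print("DIT recombination matches")
```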