Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?

2018-11-05 Thread Stefan Sullivan
I'm definitely not the most mathy person on the list, but I think there's
something particular about the complex exponentials, real transforms, and the
2-point case. For any real-input DFT you should get real-valued samples at DC
and Nyquist, which indeed you do get with your matrix. However, a 4-point DFT
matrix should have some complex entries, which you won't get no matter how
many matrices of that real-valued form you multiply together. My guess is that
yours is a special case of a DFT matrix for 2 bins. I suspect that if you took
a 4-point DFT matrix and tried the same construction, it might work out better.

https://en.wikipedia.org/wiki/DFT_matrix
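For concreteness, here is a quick numpy sketch (my own illustration, not
something from the earlier posts) of why the 2-point case is special: the
2-point DFT matrix is purely real, while the 4-point one already has complex
entries.

import numpy as np

def dft_matrix(n):
    # n x n DFT matrix: W[j, k] = exp(-2j*pi*j*k/n)
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi * j * k / n)

W2 = dft_matrix(2)                  # [[1, 1], [1, -1]] -- purely real
W4 = dft_matrix(4)                  # has entries like -1j and +1j

print(np.allclose(W2.imag, 0.0))    # True: the 2-bin DFT matrix is real
print(np.allclose(W4.imag, 0.0))    # False: the 4-bin matrix is genuinely complex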

Stefan

On Mon, Nov 5, 2018, 12:40 Ethan Duni wrote:

> You can combine consecutive DFTs. Intuitively, the basis functions are
> periodic on the transform length. But it won't be as efficient as having
> done the big FFT (as you say, the decimation in time approach interleaves
> the inputs, so you gotta pay the piper to unwind that). Note that this is
> for naked transforms of successive blocks of inputs, not a WOLA filter
> bank.
>
> There are Dolby codecs that do similar with a suitable flavor of DCT (type
> II I think?) - you have your encoder going along at the usual frame rate,
> but if it detects a string of stationary inputs it can fold them together
> into one big high-res DCT and code that instead.

Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?

2018-11-05 Thread Ethan Duni
You can combine consecutive DFTs. Intuitively, the basis functions are
periodic on the transform length. But it won't be as efficient as having
done the big FFT (as you say, the decimation in time approach interleaves
the inputs, so you gotta pay the piper to unwind that). Note that this is
for naked transforms of successive blocks of inputs, not a WOLA filter
bank.

There are Dolby codecs that do something similar with a suitable flavor of DCT
(type II, I think?) - you have your encoder going along at the usual frame
rate, but if it detects a string of stationary inputs it can fold them
together into one big high-res DCT and code that instead.
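Concretely, here is a minimal numpy sketch of combining two consecutive
N-point DFTs into the 2N-point DFT, assuming plain unwindowed blocks (an
illustration only, not anything from the codec work mentioned above): the even
bins come for free, and the odd bins cost an extra inverse/forward transform
pair - that's the piper getting paid.

import numpy as np

rng = np.random.default_rng(0)
N = 8
x = rng.standard_normal(2 * N)        # two consecutive blocks of length N

X0 = np.fft.fft(x[:N])                # DFT of the first block
X1 = np.fft.fft(x[N:])                # DFT of the second block
Y = np.fft.fft(x)                     # the 2N-point DFT we want to recover

# Even bins come for free: Y[2m] = X0[m] + X1[m]
even_bins = X0 + X1

# Odd bins cost an extra inverse/forward transform pair:
# x[n] - x[n+N] = IDFT_N(X0 - X1)[n]; modulate by exp(-i*pi*n/N), then DFT again.
n = np.arange(N)
odd_bins = np.fft.fft(np.fft.ifft(X0 - X1) * np.exp(-1j * np.pi * n / N))

print(np.allclose(even_bins, Y[0::2]))   # True
print(np.allclose(odd_bins, Y[1::2]))    # True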

On Mon, Nov 5, 2018 at 11:34 AM Ethan Fenn  wrote:

> I don't think that's correct -- DIF involves first doing a single stage of
> butterfly operations over the input, and then doing two smaller DFTs on
> that preprocessed data. I don't think there is any reasonable way to take
> two "consecutive" DFTs of the raw input data and combine them into a longer
> DFT.
>
> (And I don't know anything about the historical question!)
>
> -Ethan

Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?

2018-11-05 Thread robert bristow-johnson


Well, I dunno shit about the history either.  I just ascribed all of the
radix-2 FFT to Cooley and Tukey.

But I think you're mistaken about the technical claim.  If you have or can get
Oppenheim and Schafer, go to the FFT chapter of whatever revision you have:
there are several different 8-point FFTs that they illustrate.

--

r b-j                     r...@audioimagination.com

"Imagination is more important than knowledge."

 Original message 
From: Ethan Fenn
Date: 11/5/2018  11:34 AM  (GMT-08:00)
To: robert bristow-johnson, music-dsp@music.columbia.edu
Subject: Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?

> I don't think that's correct -- DIF involves first doing a single stage of
> butterfly operations over the input, and then doing two smaller DFTs on that
> preprocessed data. I don't think there is any reasonable way to take two
> "consecutive" DFTs of the raw input data and combine them into a longer
> DFT.
>
> (And I don't know anything about the historical question!)
>
> -Ethan

Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?

2018-11-05 Thread Ethan Fenn
I don't think that's correct -- DIF involves first doing a single stage of
butterfly operations over the input, and then doing two smaller DFTs on
that preprocessed data. I don't think there is any reasonable way to take
two "consecutive" DFTs of the raw input data and combine them into a longer
DFT.
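For illustration, a small numpy sketch of that single DIF stage (the
butterflies act on the raw time-domain input before the two half-length DFTs):

import numpy as np

rng = np.random.default_rng(1)
N = 8
x = rng.standard_normal(2 * N)

# One radix-2 decimation-in-frequency stage for a length-2N DFT.
n = np.arange(N)
g = x[:N] + x[N:]                                         # butterfly sums
h = (x[:N] - x[N:]) * np.exp(-2j * np.pi * n / (2 * N))   # differences times twiddles

Y = np.fft.fft(x)
print(np.allclose(np.fft.fft(g), Y[0::2]))   # True: even bins of the big DFT
print(np.allclose(np.fft.fft(h), Y[1::2]))   # True: odd bins of the big DFT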

(And I don't know anything about the historical question!)

-Ethan



On Mon, Nov 5, 2018 at 2:18 PM, robert bristow-johnson <
r...@audioimagination.com> wrote:

>
>
> Ethan, that's just the difference between Decimation-in-Frequency FFT and
> Decimation-in-Time FFT.
>
> i guess i am not entirely certain of the history, but i credited both
> the DIT and DIF FFT to Cooley and Tukey.  that might be an incorrect
> historical impression.

Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?

2018-11-05 Thread robert bristow-johnson



Ethan, that's just the difference between Decimation-in-Frequency FFT and
Decimation-in-Time FFT.

i guess i am not entirely certain of the history, but i credited both the DIT
and DIF FFT to Cooley and Tukey.  that might be an incorrect historical
impression.



 Original Message 
Subject: Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?
From: "Ethan Fenn"
Date: Mon, November 5, 2018 10:17 am
To: music-dsp@music.columbia.edu
--

> It's not exactly Cooley-Tukey. In Cooley-Tukey you take two _interleaved_
> DFT's (that is, the DFT of the even-numbered samples and the DFT of the
> odd-numbered samples) and combine them into one longer DFT. But here you're
> talking about taking two _consecutive_ DFT's. I don't think there's any
> cheap way to combine these to exactly recover an individual bin of the
> longer DFT.
>
> Of course it's possible you'll be able to come up with a clever frequency
> estimator using this information. I'm just saying it won't be exact in the
> way Cooley-Tukey is.
>
> -Ethan

--

r b-j                     r...@audioimagination.com

"Imagination is more important than knowledge."

Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?

2018-11-05 Thread gm




On 05.11.2018 at 16:17, Ethan Fenn wrote:

> Of course it's possible you'll be able to come up with a clever
> frequency estimator using this information. I'm just saying it won't
> be exact in the way Cooley-Tukey is.


Maybe, but not the way I laid it out.

Also it seems wiser to interpolate spectral peaks, as had been suggested
to me before.


But that doesn't sound much better than getting the frequency from the
phase step, so the bad sound for frequency shifts at an FFT size of 2048
has other reasons than just a bad phase estimate.
Maybe it's just stupid to look for a solution at this FFT size with a
frequency-domain shift.
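For reference, a minimal sketch of the usual phase-step (phase vocoder)
frequency estimate being discussed; the sample rate, hop size, and Hann window
here are assumed values for illustration only.

import numpy as np

def phase_step_freqs(frame0, frame1, hop, fs, fft_size):
    # Per-bin frequency estimate from the phase advance between two
    # successive, identically windowed frames (standard phase-vocoder trick).
    bins = np.arange(fft_size // 2 + 1)
    X0 = np.fft.rfft(frame0)
    X1 = np.fft.rfft(frame1)
    expected = 2.0 * np.pi * bins * hop / fft_size        # phase advance of each bin center
    dphi = np.angle(X1) - np.angle(X0) - expected
    dphi = np.mod(dphi + np.pi, 2.0 * np.pi) - np.pi      # wrap to [-pi, pi)
    return (bins + dphi * fft_size / (2.0 * np.pi * hop)) * fs / fft_size

# toy check: 1 kHz sine at fs = 48 kHz, FFT size 2048, hop 512 (assumed values)
fs, fft_size, hop = 48000, 2048, 512
t = np.arange(fft_size + hop) / fs
x = np.sin(2.0 * np.pi * 1000.0 * t)
w = np.hanning(fft_size)
freqs = phase_step_freqs(w * x[:fft_size], w * x[hop:hop + fft_size], hop, fs, fft_size)
peak = np.argmax(np.abs(np.fft.rfft(w * x[:fft_size])))
print(freqs[peak])    # close to 1000.0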





Re: [music-dsp] 2-point DFT Matrix for subbands Re: FFT for realtime synthesis?

2018-11-05 Thread Ethan Fenn
It's not exactly Cooley-Tukey. In Cooley-Tukey you take two _interleaved_
DFT's (that is, the DFT of the even-numbered samples and the DFT of the
odd-numbered samples) and combine them into one longer DFT. But here you're
talking about taking two _consecutive_ DFT's. I don't think there's any
cheap way to combine these to exactly recover an individual bin of the
longer DFT.
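For illustration, a short numpy sketch of that interleaved
(decimation-in-time) combination:

import numpy as np

rng = np.random.default_rng(2)
N = 8
x = rng.standard_normal(2 * N)

E = np.fft.fft(x[0::2])                      # DFT of the even-indexed samples
O = np.fft.fft(x[1::2])                      # DFT of the odd-indexed samples
k = np.arange(N)
W = np.exp(-2j * np.pi * k / (2 * N))        # twiddle factors

Y = np.concatenate([E + W * O, E - W * O])   # full 2N-point DFT
print(np.allclose(Y, np.fft.fft(x)))         # True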

Of course it's possible you'll be able to come up with a clever frequency
estimator using this information. I'm just saying it won't be exact in the
way Cooley-Tukey is.

-Ethan



On Mon, Nov 5, 2018 at 12:28 AM, gm  wrote:

>
>
> On 05.11.2018 at 01:56, gm wrote:
>
>> so you do the "radix 2 algorithm" if you will on a subband, and now what?
>> the bandlimits are what? the neighbouring upper and lower bands?
>>
>> how do I get a frequency estimate "in between" out of these two real
>> values that describe the upper and lower limit of the band but have no
>> further information?
>>
>> thank you.
>>
> The way I see it:
>
> If you do that 2-point transform on a band you get 2 data points instead
> of one (or rather instead of two successive ones, of course), representing
> the upper and lower bandwidth limits of the band, but not very well separated.
> But if you also take the result of the previous frame into account, you now
> get 4 points representing the corners of a bin
> of the original spectrum, so to speak, though in between spectra, and you
> can now do bilinear interpolation between these 4 points.
>
> But in the end this is just crude averaging between two successive spectra,
> and I am not sure if it sounded better
> or worse. It's hard to tell a difference; it works quite well on a sine
> sweep though.
>
> But there must be a better way to make use of these 2 extra data points.
>
> In the end you now have the same amount of information as with a spectrum
> of double size.
> So you should be able to obtain the same quality from that.
> That was my way of thinking, however flawed that is; I'd like to know.