Never forget that there are stringent theoretical limits on, for instance, algorithms using (normally) sampled data in real time, waveform messing, vocoding and such. It isn't possible "ad infinitum" to compute the exact frequency content of a signal without considering the signal between the samples, which may take quite a large number of samples to do mathematically right (sinc functions) if you want to satisfy e.g. CD sampling accuracy.
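
To make that concrete, here is a minimal Python/numpy sketch (my own illustration, not anybody's production code) of truncated sinc interpolation, showing that reading the signal between the samples accurately indeed takes quite a number of neighboring samples:

import numpy as np

def sinc_interpolate(x, t, n_taps=64):
    """Estimate x(t) at fractional position t with a truncated sinc sum.
    x      : 1-D array of samples (assumed properly bandlimited)
    t      : fractional index, e.g. 100.5 is halfway between samples 100 and 101
    n_taps : number of neighboring samples used on each side (the truncation)
    """
    n0 = int(np.floor(t))
    n = np.arange(n0 - n_taps, n0 + n_taps + 1)
    n = n[(n >= 0) & (n < len(x))]           # stay inside the buffer
    return np.sum(x[n] * np.sinc(t - n))     # np.sinc(k) = sin(pi k)/(pi k)

# A 1 kHz sine at 44.1 kS/s, evaluated halfway between two samples:
fs, f = 44100.0, 1000.0
x = np.sin(2 * np.pi * f * np.arange(2048) / fs)
t = 1000.5
exact = np.sin(2 * np.pi * f * t / fs)
for taps in (8, 64, 512):
    err = abs(sinc_interpolate(x, t, taps) - exact)
    print(f"{taps:4d} taps per side: error = {err:.2e}")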

Also, as with all pitch shifting: chords made up of even a few notes will exhibit characteristic "beating" patterns, and the only way to preserve that behavior accurately is to do the equivalent of a detune like on a record player, which honestly isn't necessarily that easy to do on a sampled signal when one wants to preserve the quality and phase of the higher-frequency components, especially in near real time.
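
As an illustration of the record-player equivalent, a crude Python sketch (mine, linear interpolation only, so nowhere near the quality argued for above) of detuning by plainly resampling the whole signal, which keeps the beating between two close partials exactly intact but of course also changes the duration:

import numpy as np

def varispeed(x, ratio):
    """Play back x at `ratio` times the original speed via linear interpolation.
    ratio > 1 raises the pitch (and shortens the signal), ratio < 1 lowers it.
    A quality version would use long windowed-sinc interpolation as above."""
    t = np.arange(0, len(x) - 1, ratio)        # fractional read positions
    i = t.astype(int)
    frac = t - i
    return (1.0 - frac) * x[i] + frac * x[i + 1]

# Two close partials (a 2 Hz beat), shifted up a semitone together;
# the beat pattern is preserved, merely sped up along with everything else.
fs = 48000
n = np.arange(fs)                              # one second
chord = np.sin(2 * np.pi * 440 * n / fs) + np.sin(2 * np.pi * 442 * n / fs)
up = varispeed(chord, 2 ** (1 / 12))
print(len(chord), len(up))                     # the duration changes too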

Formants are based in part on envelope detection. Without a seriously upsampled signal, major "measurement" errors will occur here, even for mid-frequency signals, e.g. when working at 48 kS/s.
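
A small Python/scipy sketch (my own choice of numbers and block sizes, purely illustrative) of that kind of error: a naive per-block peak follower on a constant-amplitude 5 kHz tone at 48 kS/s reports an envelope that ripples by a few tenths of a dB simply because of where the sample grid falls relative to the waveform peaks, while the same follower on an 8x upsampled version barely ripples at all:

import numpy as np
from scipy.signal import resample_poly

fs, f = 48000, 5000.0                          # about 9.6 samples per cycle
x = np.sin(2 * np.pi * f * np.arange(4800) / fs)   # 0.1 s, true amplitude 1.0

def block_peak_env_db(sig, block):
    """Peak-per-block 'envelope' in dB (a crude rectify-and-hold follower)."""
    trimmed = sig[: len(sig) // block * block].reshape(-1, block)
    return 20 * np.log10(np.max(np.abs(trimmed), axis=1))

# same time span per block in both cases; edge blocks trimmed to skip the
# resampling filter's transients
env    = block_peak_env_db(x, 10)[5:-5]
env_up = block_peak_env_db(resample_poly(x, 8, 1), 80)[5:-5]

print(f"envelope ripple at 48 kS/s      : {env.max() - env.min():.2f} dB")
print(f"envelope ripple, 8x oversampled : {env_up.max() - env_up.min():.2f} dB")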

Filters are another interesting matter in all these algorithms: chances are that when you resample the same signal with a starting point that is, let's say, half a sample different, all kinds of filter algorithms will show by quite a few dBs how non-shift-invariant they are. With FFT-based algorithms you may even have to take into account that the test signal can only be shifted by a multiple of the FFT length to get the same, repeatable results, which is hardly useful in practice.
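
To show the FFT part of that, a small Python sketch (my own test setup; I use an integer 100-sample shift rather than a half-sample one, to keep fractional resampling out of the picture) of a naive block-FFT "brick wall" low-pass applied to noise, once directly and once with the input delayed by an amount that is not a multiple of the FFT length. A genuinely shift-invariant (LTI) filter would give the identical, merely delayed, output; the block process does not:

import numpy as np

N = 1024                                       # FFT / block length
rng = np.random.default_rng(0)
x = rng.standard_normal(16 * N)

def block_fft_lowpass(sig, keep=64):
    """Zero all but the lowest `keep` bins in each non-overlapping block."""
    out = np.empty_like(sig)
    for start in range(0, len(sig), N):
        spec = np.fft.rfft(sig[start:start + N])
        spec[keep:] = 0.0
        out[start:start + N] = np.fft.irfft(spec, N)
    return out

shift = 100                                    # deliberately NOT a multiple of N
delayed = np.concatenate([np.zeros(shift), x])[: len(x)]   # x delayed 100 samples
y_direct  = block_fft_lowpass(x)[: len(x) - shift]
y_delayed = block_fft_lowpass(delayed)[shift:]             # undo the delay
rel_db = 20 * np.log10(np.linalg.norm(y_delayed - y_direct)
                       / np.linalg.norm(y_direct))
print(f"output difference relative to the signal: {rel_db:.1f} dB")
# a shift-invariant filter would put this figure down at numerical noise level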

So, contrary to what is often suggested in quasi-scientific articles, these are all approximation techniques, with pros and cons that had better be laid out clearly, at least qualitatively, when doing this type of work... Also, these algorithms aren't new: various types and brands of professional effects processors already did this sort of thing in the 70s, and unlike much of today's software they were better thought through. So it isn't fair, to students for instance, to suggest that rocket science is being invented here, when in fact that already took place, and to much higher (EE) standards.

This is only to point out some possible errors, and to promote un-hyped talk about the quality of all these kinds of algorithms as something normal, which is still far from the standard set by, for instance, the "standard model" in physics.

Ir. Theo Verelst.
