On Fri, Jun 14, 2002 at 08:30:45 -0400, Charles Baker wrote:

> what this and the earlier gedankenexperiment about the jazz drummer on an
> extremely upbeat tune implies (and I tend to think it is true) is that
> humans can judge delays between direct and indirect sound clearly into the
> realm of phase relationships between the two signals (again, giving lie to
> the homily I was taught 15 years ago about humans not perceiving phase
> relationships in sound.)
I believe this is usually said about phase within a sound, not relative phase,
which will obviously have audible effects (due to superposition). You can test
this by taking a sawtooth and phase-shifting its harmonics: it still sounds
like a sawtooth, though the waveform looks like a mess.

> I think that we *should* aim for latencies that make drummers and guitarists
> happy, but I feel that latency has been given too much time recently in
> computer music real-time issues. More important to me is the lack of control

I agree. For me, another important thing about computer music is the infinite
possibilities it offers. Instead of randomly buying new synths, I can
code/patch my own and come up with some interesting new sounds.

That said, support for conventional recording is important as a service to
humanity ;) and I heartily support all the amazing work that has been put in
by the 64-frames-or-bust crowd.

- Steve
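P.S. The phase-shift test above can be sketched in a few lines of Python. This is just an illustrative sketch (numpy only; the sample rate, fundamental, and harmonic count are arbitrary choices of mine): it synthesizes a band-limited sawtooth from its Fourier series, then resynthesizes it with each harmonic's starting phase randomized. The two waveforms differ sample by sample, but their magnitude spectra are identical, which is why they sound alike.

```python
import numpy as np

fs = 8000        # sample rate in Hz (arbitrary for this demo)
f0 = 100.0       # fundamental frequency in Hz
n_harm = 20      # harmonics used; 20 * 100 Hz stays below Nyquist
t = np.arange(fs) / fs  # one second of time samples

rng = np.random.default_rng(0)

def sawtooth(phases):
    # Fourier series of a sawtooth: sum over k of sin(2*pi*k*f0*t + phase_k)/k
    return sum(np.sin(2 * np.pi * k * f0 * t + phases[k - 1]) / k
               for k in range(1, n_harm + 1))

zero_phase = sawtooth(np.zeros(n_harm))               # "textbook" sawtooth
shifted = sawtooth(rng.uniform(0, 2 * np.pi, n_harm)) # randomized phases

# The waveforms are clearly different ("looks like a mess")...
print(np.allclose(zero_phase, shifted))  # False

# ...but the magnitude spectra match to within numerical noise.
mag_a = np.abs(np.fft.rfft(zero_phase))
mag_b = np.abs(np.fft.rfft(shifted))
print(np.max(np.abs(mag_a - mag_b)) < 1e-6 * np.max(mag_a))  # True
```

Play the two signals back and they sound the same, even though a scope trace of the second no longer looks like a ramp.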
