Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-08-09 Thread Sampo Syreeni

On 2015-07-20, Tom Duffy wrote:

Using separate reverbs on each instrument in a DAW recording gives a 
richer mix than just a single reverb on the master channel.


What it gives you is higher decorrelation across channels. And our ears 
are used to that, because as soon as you move a sound source even one 
metre in an enclosed, reverberant space, the precise reverberation 
pattern changes drastically. We perceptually expect a lot of 
decorrelation from the decaying part of a reverberant sound...though at 
the same time less from the early, distinguishable slap echoes. (Or, 
let's say, we expect a different kind of decorrelation; in the short 
time frame interaural decorrelation because of delay, and in the longer 
frame essential whiteness overall.)
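To make the decorrelation point concrete, here is a small illustrative sketch (mine, not from the thread): two independently seeded, exponentially decaying noise bursts stand in for per-channel late reverb tails, and their normalized cross-correlation is compared against the single-shared-reverb case. All names and parameter values are my own assumptions.

```python
import math
import random

def synthetic_tail(seed, n=20000, rt60=10000):
    # Exponentially decaying white noise as a crude stand-in for a late
    # reverb tail: -60 dB over rt60 samples, independently seeded per channel.
    rng = random.Random(seed)
    return [rng.gauss(0.0, 1.0) * math.exp(-6.91 * i / rt60) for i in range(n)]

def normalized_corr(a, b):
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den

left = synthetic_tail(seed=1)
right_separate = synthetic_tail(seed=2)  # a different "reverb" per channel
right_shared = left                      # one reverb on the master bus

print(normalized_corr(left, right_separate))  # small: decorrelated channels
print(normalized_corr(left, right_shared))    # 1.0: fully correlated
```

Real reverb tails are filtered noise rather than white, but the comparison — near-zero versus unity correlation — is the property the ear keys on in the decaying part of the sound.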


Tom, look at how DirAC processes its arrivals. Starting from Ville 
Pulkki's research at then TKK Acoustics Lab, and now continuing at 
Aalto. It's entirely predicated on this sort of thing in its reverb leg.

--
Sampo Syreeni, aka decoy - de...@iki.fi, http://decoy.iki.fi/front
+358-40-3255353, 025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2
___
music-dsp mailing list
music-dsp@music.columbia.edu
https://lists.columbia.edu/mailman/listinfo/music-dsp


Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-21 Thread robert bristow-johnson

On 7/20/15 7:49 PM, Nigel Redmon wrote:

To add to Robert’s comment on discrete-time analog…

The only thing special about digital sampling is that it’s stable (those 
digital numbers can be pretty durable—the analog samples don’t hold up so well) 
and convenient for computation. But the digital audio samples are just a 
representation of the analog audio that’s been pulse amplitude modulated. (I 
never worked with BBDs or CCDs, and suspect there’s some zero-order hold 
involved in practical implementations,


there's gotta be *some* voltage at the output at all times.  doubt that 
it's return to zero, so ZOH makes the most sense.



  but it doesn’t matter as long as that’s compensated for—digital to analog 
converters have also dealt with the same sort of issue. Still, the basis is 
that those samples in the BBD/CCD represent impulses, momentary snapshots.) 
Just as with the digital versions, in the analog versions you have a lowpass 
filter to ensure the input spectrum remains below half the sample rate, and on 
the output you have a filter to get rid of the aliased images, created by the 
modulation process.
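As a numeric aside (my own illustration, not from the post): the zero-order hold multiplies the spectrum by a sinc, which both droops the top of the audio band and partly attenuates the PAM images that the output filter must remove. The droop is the part that gets "compensated for" in both BBD designs and digital converters.

```python
import math

def zoh_gain_db(f, fs):
    # Magnitude response of a zero-order hold: |sin(pi*f/fs) / (pi*f/fs)|.
    x = math.pi * f / fs
    return 20.0 * math.log10(abs(math.sin(x) / x)) if x else 0.0

fs = 48000.0
print(zoh_gain_db(20000.0, fs))      # a few dB of droop near the band edge
print(zoh_gain_db(fs - 1000.0, fs))  # first image region: attenuated, not gone
```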

In the early days of digital delays, the analog delays had some advantages that 
are probably not obvious to someone coming from today’s knowledge. For 
instance, today we’d make a delay with a constant sample rate, and use a 
software LFO and an interpolated delay line to make a flanger. But back then, 
computation was difficult and costly, so it was done the same way that the 
analog delays did it: variable sample rate and vary the clock frequency with a 
hardware LFO. The advantage of digital was better fidelity, but the analog 
delays could sweep over a much wider range. Digital memory wasn’t so fast back 
then, and super-fast A/Ds were huge bucks (I worked for a group in a company in 
the late ‘70s that made a 40 MHz 8-bit A/D chip that was $800 in low 
quantities, and they sold ‘em as fast as they could make ‘em).
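The "modern" approach Nigel describes — constant sample rate, a software LFO sweeping an interpolated delay line — can be sketched as follows. This is my own minimal illustration: the function name and parameter values are assumptions, and a real flanger would add feedback and better-than-linear interpolation.

```python
import math

def flange(x, fs, min_ms=0.5, depth_ms=3.0, rate_hz=0.25, mix=0.7):
    # Constant-rate flanger: a raised-cosine LFO sweeps a fractional delay
    # (linear interpolation between adjacent samples), mixed with the dry path.
    max_delay = int((min_ms + depth_ms) * fs / 1000.0) + 2
    buf = [0.0] * max_delay
    out = []
    for n, s in enumerate(x):
        buf[n % max_delay] = s
        lfo = 0.5 * (1.0 - math.cos(2.0 * math.pi * rate_hz * n / fs))  # 0..1
        d = (min_ms + depth_ms * lfo) * fs / 1000.0  # delay in samples
        i, frac = int(d), d - int(d)
        a = buf[(n - i) % max_delay]
        b = buf[(n - i - 1) % max_delay]
        out.append(s + mix * (a * (1.0 - frac) + b * frac))
    return out
```

Sweeping the delay against the dry signal produces the moving comb-filter notches; the variable-clock designs of the era got the same sweep by modulating the sample clock instead of the read position.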


geepers.  that's fast.  around 1979-80, i did a DSP project with an 
MC6809 and a 12-bit DAC that i doubled up and used with a comparator as 
a successive approximation ADC.  in those days the DAC was $40 and we 
didn't wanna spend money getting an ADC.  the sampling rate was 
something like 10 Hz (it was a medical application and the signal was 
very slow.)


--

r b-j  r...@audioimagination.com

Imagination is more important than knowledge.



--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp 
links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp

Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-21 Thread Nigel Redmon

 On Jul 20, 2015, at 11:12 PM, robert bristow-johnson 
 r...@audioimagination.com wrote:
 
 On 7/20/15 7:49 PM, Nigel Redmon wrote:
 To add to Robert’s comment on discrete-time analog…
 
 The only thing special about digital sampling is that it’s stable (those 
 digital numbers can be pretty durable—the analog samples don’t hold up so 
 well) and convenient for computation. But the digital audio samples are just 
 a representation of the analog audio that’s been pulse amplitude modulated. 
 (I never worked with BBDs or CCDs, and suspect there’s some zero-order hold 
 involved in practical implementations,
 
 there's gotta be *some* voltage at the output at all times.  doubt that it's 
 return to zero, so ZOH makes the most sense.

Right. What I meant to imply is that the (mathematical) ideal is an impulse 
(return to zero), but for practical reasons it’s basically ZOH and you make 
adjustments.

  but it doesn’t matter as long as that’s compensated for—digital to analog 
 converters have also dealt with the same sort of issue. Still, the basis is 
 that those samples in the BBD/CCD represent impulses, momentary snapshots.) 
 Just as with the digital versions, in the analog versions you have a lowpass 
 filter to ensure the input spectrum remains below half the sample rate, and 
 on the output you have a filter to get rid of the aliased images, created by 
 the modulation process.
 
 In the early days of digital delays, the analog delays had some advantages 
 that are probably not obvious to someone coming from today’s knowledge. For 
 instance, today we’d make a delay with a constant sample rate, and use a 
 software LFO and an interpolated delay line to make a flanger. But back 
 then, computation was difficult and costly, so it was done the same way that 
 the analog delays did it: variable sample rate and vary the clock frequency 
 with a hardware LFO. The advantage of digital was better fidelity, but the 
 analog delays could sweep over a much wider range. Digital memory wasn’t so 
 fast back then, and super-fast A/Ds were huge bucks (I worked for a group in 
 a company in the late ‘70s that made a 40 MHz 8-bit A/D chip that was $800 
 in low quantities, and they sold ‘em as fast as they could make ‘em).
 
 geepers.  that's fast.  around 1979-80, i did a DSP project with an MC6809 and 
 a 12-bit DAC that i doubled up and used with a comparator as a successive 
 approximation ADC.  in those days the DAC was $40 and we didn't wanna spend 
 money getting an ADC.  the sampling rate was something like 10 Hz (it was a 
 medical application and the signal was very slow.)

These 8-bit ADCs were “flash” converters (a string of resistors with 
comparators feeding a MUX), usually used in video applications. They dropped to 
$500 in quantities…or, you could buy ones with a missing code or two cheaper, 
and correct for it in software, as some people on a budget did. They also made 
those popular 16x16 multipliers and MACs (MPY-16 and MAC-16) that people would 
make hardware FFT butterflies with. It runs in my mind that the Bell Labs 
(Alles) synth used a bunch of the multipliers. Now imagine a board full of 
these things, dissipating 5W each (the MAC-16s anyway—the MPY-16s were a bit 
less as I recall)…LOL.

One cool thing about the 6809 (chosen because it had a multiply) was that it 
did all memory access on a half-cycle, so you could put two of them on the 
same memory, out of phase, to do more crunching.


Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-21 Thread padawa...@obiwannabe.co.uk

 On 20 July 2015 at 22:24 Nigel Redmon earle...@earlevel.com wrote:


 Here’s an interesting interview:

 http://www.studioelectronics.biz/Documents/SSC.DEVICE.pdf


Thanks for sharing; what a delightfully inspiring read to find
in my inbox this morning.

So many timeless observations and patterns in the intro. The bit
about choosing whether to go to market with half-bakery that would
sell heaps but ultimately damage the company reputation (people
really cared about long-term reputation in those days), or to
perfect a product, is notable. As is the transfer of ideas into a
music technology domain from the work he was doing in medical
electronics. And “gricky” definitely needs to be an official delay
control parameter in honour of St. Croix.

 
cheers
Andy

Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-21 Thread Theo Verelst

robert bristow-johnson wrote:

...
just for the record, none of them content words were written by me.
...

And it's back, in the Prophet-6. I built one of those dual BBD effects
with filters and electronics, with interesting options, with a sine
wave LFO modulated clock to act as a Leslie effect, which was fun,
though without noise suppression very hissy.


so it's delay modulation, which is supposed to be the Leslie?  you
really should have a synchronous, but outa-phase, amplitude modulation
along with the delay modulation, to emulate a Leslie.  and multiple
reflection paths to an auditory scene (a wall or two with reflections)
and a stereo output derived from that.



I was talking about the new P6 having BBDs (or a simulation of them). Not 
directly connected with that: I used BBDs in the early 80s for simulating, 
among other things, the hard-to-do phase shifting of an imitated organ signal, 
with an added compander I designed. It was nowhere near the sonic riches of 
good digital simulations from later times, but it didn't sound eeky, or in 
that (to me dreadful) serene listen to this way of messing with sampling 
errors. I don't know how much error was in the balanced BBD I used; probably 
there was leakage between parts of the charge-passing stages, and forms of 
unspecified filtering. It was fun to just modulate the clock in the analog 
domain; there were also digital delays in that time that would let you 
smoothly modulate the sampling clock. Doing the same properly with a digital 
simulation *including correction for sampling errors* isn't necessarily easy.
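One way to approximate the clock-modulated behaviour in a fixed-rate digital simulation is to note that an N-stage BBD delays by N / f_clock, so an LFO on the clock becomes a time-varying fractional delay. The sketch below is my own simplification under that assumption; it deliberately ignores the companding, clock noise, and aliasing corrections that make a faithful simulation genuinely hard.

```python
import math

def bbd_delay(x, fs, n_stages=1024, clock_hz=50000.0, lfo_hz=6.0, lfo_depth=0.3):
    # An N-stage bucket brigade delays by n_stages / f_clock seconds, so a
    # sine LFO on the clock sweeps the delay length; read the buffer with
    # linear interpolation at a fixed audio rate fs.
    max_delay = int(n_stages * fs / (clock_hz * (1.0 - lfo_depth))) + 2
    buf = [0.0] * max_delay
    out = []
    for n, s in enumerate(x):
        buf[n % max_delay] = s
        fclk = clock_hz * (1.0 + lfo_depth * math.sin(2.0 * math.pi * lfo_hz * n / fs))
        d = n_stages * fs / fclk  # current delay, in output samples
        i, frac = int(d), d - int(d)
        a = buf[(n - i) % max_delay]
        b = buf[(n - i - 1) % max_delay]
        out.append(a * (1.0 - frac) + b * frac)
    return out
```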



That sure is better even with ...

certain synthesizers (in this case the Yamaha Motif line) have nice
oscillators, but it isn't possible to get certain sampled waves to not
overlay more than 1 sample,

...
uhm, what do you mean?  do you mean that the samples for each voice are
being played out at different sample rates ...


What I mean is that, for sound reasons and possibly for preserving 
intellectual property (I don't know), the machines in many cases output more 
than one sample at the same time. Even if you take one oscillator and one 
note is played, it outputs a combined waveform consisting (at least, when I 
last looked at it) of two time-shifted versions of the same sample. So the 
assignment would be to take a source which outputs layers of the same sample, 
possibly (let's presume) at the same frequency, but shifted in time. So you 
turn all modulations, envelopes and filters of a Motif synth off, output a 
string waveform from only one oscillator, and you get two waves. In the 
simplest case, I would like some "un-add" delay effect to take the signal 
which was layered and time-shifted at the output of the synthesizer, so that 
out of the delay-remover effect I'd get the sample used in the synth.


So essentially, you'd have to estimate the delay time used, and undo the 
adding of the delayed signal. Going to the frequency domain is fine, but it's 
some work and might not give sample-accurate delay removal!







I realize that's a bit tough and might involve inversions with linear
algebra and some iterations even, but it's a fun subject. I mean so
much going on, but simply extracting a signal in the presence of a
delayed added version of the same signal isn't generally available!



you mean inverting the filter:

  H(s) =  1 + alpha*e^(-s*T)

where T is the delay and alpha is the relative gain of the delayed added
version?  that can be generally done if you know T and alpha.
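In discrete time, assuming the delay T is an integer number of samples and |alpha| < 1 (so the recursive inverse is stable), that inversion is just a one-tap recursion: given x[n] = s[n] + alpha*s[n-T], compute s[n] = x[n] - alpha*s[n-T]. A minimal sketch under those assumptions:

```python
def remove_delayed_copy(x, T, alpha):
    # Invert H(z) = 1 + alpha*z^-T by the recursion s[n] = x[n] - alpha*s[n-T];
    # stable as long as |alpha| < 1.
    s = [0.0] * len(x)
    for n in range(len(x)):
        s[n] = x[n] - (alpha * s[n - T] if n >= T else 0.0)
    return s

# Round trip: add a delayed copy, then remove it.
sig = [1.0, 0.5, -0.25, 0.0, 0.75, -0.5, 0.25, 0.0]
T, alpha = 3, 0.6
mixed = [sig[n] + (alpha * sig[n - T] if n >= T else 0.0) for n in range(len(sig))]
recovered = remove_delayed_copy(mixed, T, alpha)
print(max(abs(a - b) for a, b in zip(recovered, sig)))  # essentially zero
```

In practice the hard part is what Theo and the reply both point at: estimating T and alpha from the recording before you can run this.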





T.


Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread Nigel Redmon
Er, minor correction, the effect I was talking about on the tune (where the 
echo is more part of the sound than perceived as a repeat) is the bass and the 
textural chordal change thing most easily heard in the sparse section starting 
at 1:37; my buddy added all the mallet things with echo (still cool, just 
differentiating what in my mind are two completely different uses of echo).


 On Jul 20, 2015, at 11:29 AM, Nigel Redmon earle...@earlevel.com wrote:
 
 Being a long-time fan of delays (and author of the Echo Farm PT plug-in and DL4 
 delay modeler stompbox), starting with tape delay (first a Univox tape delay, 
 graduated to a Roland Space Echo (the space echo emulation in Echo Farm is 
 based on my aged RE-101)…when digital first came in, it was neat at 
 first, but exact (near exact) delay is so boring after a bit, and you 
 realize that the rapid drop-off of frequencies in analog delays is a feature, 
 not a fault, and certainly the pitch inconsistency of tape echoes. My old prog 
 band recorded an album in 1979, and the engineer/producer wanted to use his 
 shiny new MXR rack delay. I completely regret not demanding that we use the 
 Space Echo—my modular synth sounded so tiny.
 
 Anyway, I was having a conversation with my old bandmate some time back, over 
 the phone; he’s a recording engineer/producer these days, and he mentioned 
 something about delays, saying that he never quite latched onto their use 
 (the way I had). I mentioned a fun way to use them that I had always liked (I 
 guess similar to Alan Parsons’ I Robot), then after getting off the call 
 whipped up some simple changes to show him what I meant. Being the guy he is, 
 he couldn’t help but add drums and finish it out. I made a little video for 
 it (he added the echoey sparse vibraphone/marimba melodic part, not really 
 what I’m talking about; I’m referring to the bass line and the textural 
 chordal change parts, also a mallet-ish sound by constant, where the echo is 
 integral to the sound):
 
 https://youtu.be/BsNchxCglVk
 
 
 
 On Jul 20, 2015, at 9:43 AM, Theo Verelst theo...@theover.org wrote:
 
 Hi all,
 
 No theoretical dumbfounding or deep searching incantations from me this 
 Monday, but just something I've thought about and that somehow has long 
 been a part of music and analog and digital productions.
 
 I recall when I was doing some computer audio experiments say in the early 
 80s that there was this tantalizing effect that outside of special tape 
 based machines hadn't really existed as an effect for using with random 
 audio sources: the digital delay. I recall I was happy when I'd used (low 
 fidelity) AD and DA converters and an early home computer with 64 kilobytes 
 of memory to achieve an echo effect. It was fun. For musical purposes, a bit 
 later I used various digital effect units that optionally could act as a 
 delay line, and with a feedback control, as an echo unit.
 
 It seems however that with time, the charm of the effect wore off. Just like 
 nowadays some people occupy themselves with (arguably desirable) reverb 
 reduction, it seems that using a delay isn't very cool anymore, doesn't 
 necessarily make your audio workstation output prettier waves when playing a 
 nice solo, and even it makes samples sound uglier when a digital delay 
 effect is used on them, now that everybody with a computer and a sound card 
 can do some audio processing, in a way that's a shame.
 
 Some of the early charm must have been that the effect was featured in 
 popular music, and wasn't easy enough to get for a hobbyist in the 70s, and 
 possibly that the grungy and loose feel of the low bit depth and the jittery 
 or modulated AD/DA converter clock signals was only fun while it lasted. 
 Maybe instruments aren't designed to sound good with a delay effect either, 
 or there's a conflict with the audio system's internal processing, and, as a 
 last suggestion, the studio delay effect does a little bit more than just 
 delaying that makes it so addictive...
 
 T.
 —
 
 


Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread Nigel Redmon
Being a long-time fan of delays (and author of the Echo Farm PT plug-in and DL4 
delay modeler stompbox), starting with tape delay (first a Univox tape delay, 
graduated to a Roland Space Echo (the space echo emulation in Echo Farm is based 
on my aged RE-101)…when digital first came in, it was neat at first, but 
exact (near exact) delay is so boring after a bit, and you realize that the 
rapid drop-off of frequencies in analog delays is a feature, not a fault, and 
certainly the pitch inconsistency of tape echoes. My old prog band recorded an 
album in 1979, and the engineer/producer wanted to use his shiny new MXR rack 
delay. I completely regret not demanding that we use the Space Echo—my modular 
synth sounded so tiny.

Anyway, I was having a conversation with my old bandmate some time back, over 
the phone; he’s a recording engineer/producer these days, and he mentioned 
something about delays, saying that he never quite latched onto their use (the 
way I had). I mentioned a fun way to use them that I had always liked (I guess 
similar to Alan Parsons’ I Robot), then after getting off the call whipped 
up some simple changes to show him what I meant. Being the guy he is, he 
couldn’t help but add drums and finish it out. I made a little video for it (he 
added the echoey sparse vibraphone/marimba melodic part, not really what I’m 
talking about; I’m referring to the bass line and the textural chordal change 
parts, also a mallet-ish sound by constant, where the echo is integral to the 
sound):

https://youtu.be/BsNchxCglVk



 On Jul 20, 2015, at 9:43 AM, Theo Verelst theo...@theover.org wrote:
 
 Hi all,
 
 No theoretical dumbfounding or deep searching incantations from me this 
 Monday, but just something I've thought about and that somehow has long 
 been a part of music and analog and digital productions.
 
 I recall when I was doing some computer audio experiments say in the early 
 80s that there was this tantalizing effect that outside of special tape based 
 machines hadn't really existed as an effect for using with random audio 
 sources: the digital delay. I recall I was happy when I'd used (low fidelity) 
 AD and DA converters and an early home computer with 64 kilobytes of memory to 
 achieve an echo effect. It was fun. For musical purposes, a bit later I used 
 various digital effect units that optionally could act as a delay line, and 
 with a feedback control, as an echo unit.
 
 It seems however that with time, the charm of the effect wore off. Just like 
 nowadays some people occupy themselves with (arguably desirable) reverb 
 reduction, it seems that using a delay isn't very cool anymore, doesn't 
 necessarily make your audio workstation output prettier waves when playing a 
 nice solo, and even it makes samples sound uglier when a digital delay effect 
 is used on them, now that everybody with a computer and a sound card can do 
 some audio processing, in a way that's a shame.
 
 Some of the early charm must have been that the effect was featured in 
 popular music, and wasn't easy enough to get for a hobbyist in the 70s, and 
 possibly that the grungy and loose feel of the low bit depth and the jittery 
 or modulated AD/DA converter clock signals was only fun while it lasted. 
 Maybe instruments aren't designed to sound good with a delay effect either, 
 or there's a conflict with the audio system's internal processing, and, as a 
 last suggestion, the studio delay effect does a little bit more than just 
 delaying that makes it so addictive...
 
 T.
 —



Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread padawa...@obiwannabe.co.uk
Whenever vintage delays come to my mind, I hear the sound of the bucket
brigade delay lines that were made from a chain of capacitors and switches.
In the early 80s there were many electronic magazine articles and kits to
build them. The SAD chips had a maximum delay time of about 200 ms. Were they
digital? Kind of. Were they analogue? Kind of too. A lost technology from the
gap between analogue and digital; you can hear them on a surprising number of
records, especially early electronic. That odd dub effect where a sound
converges on a single low frequency is often a BBD set to maximum feedback, I
think, but is sometimes mistaken for tape echo or early DDL.
 
best to all
Andy Farnell


Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread jpff
The first delay of which I was aware was in the piece Echo III, played on the
viola by Tim Souster in Cambridge in the early 1970s.  Not an echo or reverb
but a canon.  Delay was via two reel-to-reel tape machines, with a carefully
measured distance between them.  I cannot remember if it was the band
Intermodulation or 0db, but I loved the piece.  Not heard it for decades.

==John ff



Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread Tom Duffy

Related:

Using separate reverbs on each instrument in a DAW recording gives a
richer mix than just a single reverb on the master channel.
Back in the analog days, you'd use the multitrack tape and mixer
to do multiple passes through the best reverb in the studio.
In the early DAW days, you'd have to do the same (because of
limited CPU power and the overhead of a good reverb plug-in).
Replacing some of the reverbs with delays gave the same result,
adding a little bit of space around each instrument that didn't
build up into a mess.
A well programmed delay would be 2nd on my list of desert island
plug-ins after a good reverb.
I think the delays are still used on music you hear on the radio,
but it's dialed back in subtlety.

---
Tom.


On 7/20/2015 9:43 AM, Theo Verelst wrote:


Hi all,

No theoretical dumbfounding or deep searching incantations from me this
Monday, but just something I've thought about and that somehow has
long been a part of music and analog and digital productions.

I recall when I was doing some computer audio experiments say in the
early 80s that there was this tantalizing effect that outside of special
tape based machines hadn't really existed as an effect for using with
random audio sources: the digital delay. I recall I was happy when I'd
used (low fidelity) AD and DA converters and an early home computer with
64 kilobytes of memory to achieve an echo effect. It was fun. For
musical purposes, a bit later I used various digital effect units that
optionally could act as a delay line, and with a feedback control, as an
echo unit.

It seems however that with time, the charm of the effect wore off. Just
like nowadays some people occupy themselves with (arguably desirable)
reverb reduction, it seems that using a delay isn't very cool anymore,
doesn't necessarily make your audio workstation output prettier waves
when playing a nice solo, and even it makes samples sound uglier when a
digital delay effect is used on them, now that everybody with a computer
and a sound card can do some audio processing, in a way that's a shame.

Some of the early charm must have been that the effect was featured in
popular music, and wasn't easy enough to get for a hobbyist in the 70s,
and possibly that the grungy and loose feel of the low bit depth and the
jittery or modulated AD/DA converter clock signals was only fun while it
lasted. Maybe instruments aren't designed to sound good with a delay
effect either, or there's a conflict with audio system's internal
processing, and, as a last suggestion, the studio delay effect does a
little bit more than just delaying that makes it so addictive...

T.









Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread Theo Verelst

robert bristow-johnson wrote:

On 7/20/15 2:44 PM, padawa...@obiwannabe.co.uk wrote:

Whenever vintage delays come to my mind, I hear the sound of the bucket
brigade delay lines

And it's back, in the Prophet-6. I built one of those dual BBD effects 
with filters and electronics, with interesting options, with a sine wave 
LFO modulated clock to act as a Leslie effect, which was fun, though 
without noise suppression very hissy.


That sure is better now even with a simple software delay and a cheap built-in 
sound card; even at 16 bits, a delay can work fine at CD quality.


My interest at some point, which got me thinking, is that certain 
synthesizers (in this case the Yamaha Motif line) have nice oscillators, 
but it isn't possible to get certain sampled waves to not overlay more 
than 1 sample, in certain cases probably the same waveform playing over 
two sample replay partial engines, with a delay in between. So it would 
be a nice idea to be able to record the signal of a single note, and 
somehow extract the one sample from the two or three that play at the 
same time, presuming they're just time shifted.


I realize that's a bit tough and might involve inversions with linear 
algebra and some iterations even, but it's a fun subject. I mean so much 
going on, but simply extracting a signal in the presence of a delayed 
added version of the same signal isn't generally available!


T.



Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread Nigel Redmon
Here’s an interesting interview:

http://www.studioelectronics.biz/Documents/SSC.DEVICE.pdf

I first heard about it at AES (’75 in LA), from Stephen St. Croix himself. It 
was a brand new product, and Steve was trying to convince anyone who would 
listen. He was giving away cool t-shirts too, and my buddy and I wanted one. He 
was a little ticked, I think, because he could tell we were more interested in 
the t-shirts and were just waiting for him to finish and get the shirts, but he 
gave his passionate speech and I was listening more than he probably thought. 
He was basically selling against the new delta-encoded digital competition, 
telling us why it sucked, and why the wimpy clocking range (compared to his 
analog device) meant their flanging sucked, etc. He handed us our shirts and we 
were gone to see what other cool stuff was at the show.

But not too long after, the electronic music lab at USC got one, and I made 
good use of it. At the end of summer, it was stolen. I was a lab rat and was 
the last booking before they shut down for a couple of weeks ahead of the fall 
semester—and when they opened the lab next, it was gone. They got a new one, 
and identical circumstances—again, I was the last guy to book the lab in the 
summer session, and when they re-opened, the new one was gone as well. It’s not 
like they cleaned out the lab—someone really liked those Marshall Time 
Modulators.

So, interesting history with them. St. Croix was plagued by problems obtaining 
parts (the dbx modules, the CCDs), so I don’t think a large number were built, 
and they cost too much for me at the time. I sure loved the sound, though.


 On Jul 20, 2015, at 1:45 PM, pdowling hello.pdowl...@gmail.com wrote:
 
 Marshall Time Modulator
 
 got some good links?
 
 
 On 20 Jul 2015, at 21:40, Nigel Redmon earle...@earlevel.com wrote:
 
 Most of the old delays were BBD, but the king of discrete-time analog was 
 the Marshall Time Modulator, which used CCDs. Between the dbx companding for 
 increased s/n and the wide clock-sweeping range, it had awesome flanging 
 (-80dB notches claimed)—great double/triple tracking too.
 
 
 On Jul 20, 2015, at 12:16 PM, robert bristow-johnson 
 r...@audioimagination.com wrote:
 
 On 7/20/15 2:44 PM, padawa...@obiwannabe.co.uk wrote:
 Whenever vintage delays come to my mind, I hear the sound of the bucket 
 brigade
 delay lines
 that were made from a chain of capacitors and switches. In the early 80s 
 there
 were many
 electronic magazine articles and kits to build them. The SAD chips had a 
 maximum
 delay time
 of about 200ms. Were they digital? Kind of.
 
 no, they weren't.  not really.
 
 discrete-time is not the same as digital.
 
 
 Were they analogue? Kind of too.
 
 they were fully analog[ue].
 
 A lost technology
 from gap between analogue and digital, you can hear them on a surprising 
 number
 of records,
 especially early electronic.  That odd dub effect where a sound 
 converges on a
 single low
 frequency is often BBD set to maximum feedback I think, but is sometimes
 mistaken for tape
 echo or early DDL.
 
 to the precision of the A/D and D/A converters (which is considerable), 
 there is no reason that a modern digital delay line can't be made to sound 
 like the old CCD (or BBD or whatever you wanna call it) delay products.  
 like an analog[ue] amplifier, you might have to model in analog 
 non-linearities, noise, buzz, hum, and interference to make it sound the 
 same.  with the exception of the non-linearities, i normally think that 
 modeling the noise and buzz leaking through is not desirable.  who knows?
 
 one thing i think might be cool is to use different delay/echo effects on 
 each string of a hex-pickup gitfiddle.  just like you might have different 
 pitch shifting done on each string.
 
 
 -- 
 
 r b-j  r...@audioimagination.com
 
 Imagination is more important than knowledge.
 
 
 
 --
 dupswapdrop -- the music-dsp mailing list and website:
 subscription info, FAQ, source code archive, list archive, book reviews, 
 dsp links
 http://music.columbia.edu/cmc/music-dsp
 http://music.columbia.edu/mailman/listinfo/music-dsp
 



Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread robert bristow-johnson

On 7/20/15 3:00 PM, jpff wrote:

The first delay of which I was aware was in the piece Echo III played on the
viola by Tim Souster in Cambridge in the early 1970s.  Not an echo or reverb
but a canon.  Delay was via two reel-to-reel tape machines, with a carefully
measured distance between them.  I cannot remember if it was the band
Intermodulation or 0dB, but I loved the piece.  Not heard it for decades.



the first i remember was the Echoplex.  single tape loop, but one of 
the heads (i think the playback head) was on a mechanical slider.  i 
think there was a feedback gain knob.


i dunno what may have preceded that.  did Echo III precede the Echoplex?


--

r b-j  r...@audioimagination.com

Imagination is more important than knowledge.





Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread Nigel Redmon
Most of the old delays were BBD, but the king of discrete-time analog was the 
Marshall Time Modulator, which used CCDs. Between the dbx companding for 
increased s/n and the wide clock-sweeping range, it had awesome flanging (-80dB 
notches claimed)—great double/triple tracking too.


 On Jul 20, 2015, at 12:16 PM, robert bristow-johnson 
 r...@audioimagination.com wrote:
 
 On 7/20/15 2:44 PM, padawa...@obiwannabe.co.uk wrote:
 Whenever vintage delays come to my mind, I hear the sound of the bucket 
 brigade
 delay lines
 that were made from a chain of capacitors and switches. In the early 80s 
 there
 were many
 electronic magazine articles and kits to build them. The SAD chips had a 
 maximum
 delay time
 of about 200ms. Were they digital? Kind of.
 
 no, they weren't.  not really.
 
 discrete-time is not the same as digital.
 
 
  Were they analogue? Kind of too.
 
 they were fully analog[ue].
 
 A lost technology
 from the gap between analogue and digital, you can hear them on a surprising 
 number
 of records,
 especially early electronic.  That odd dub effect where a sound converges 
 on a
 single low
 frequency is often BBD set to maximum feedback I think, but is sometimes
 mistaken for tape
 echo or early DDL.
 
 to the precision of the A/D and D/A converters (which is considerable), there 
 is no reason that a modern digital delay line can't be made to sound like the 
 old CCD (or BBD or whatever you wanna call it) delay products.  like an 
 analog[ue] amplifier, you might have to model in analog non-linearities, 
 noise, buzz, hum, and interference to make it sound the same.  with the 
 exception of the non-linearities, i normally think that modeling the noise 
 and buzz leaking through is not desirable.  who knows?
 
 one thing i think might be cool is to use different delay/echo effects on 
 each string of a hex-pickup gitfiddle.  just like you might have different 
 pitch shifting done on each string.
 
 
 -- 
 
 r b-j  r...@audioimagination.com
 
 Imagination is more important than knowledge.
 
 
 

Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread robert bristow-johnson

On 7/20/15 2:44 PM, padawa...@obiwannabe.co.uk wrote:

Whenever vintage delays come to my mind, I hear the sound of the bucket brigade
delay lines
that were made from a chain of capacitors and switches. In the early 80s there
were many
electronic magazine articles and kits to build them. The SAD chips had a maximum
delay time
of about 200ms. Were they digital? Kind of.


no, they weren't.  not really.

discrete-time is not the same as digital.



  Were they analogue? Kind of too.


they were fully analog[ue].


A lost technology
from the gap between analogue and digital, you can hear them on a surprising number
of records,
especially early electronic.  That odd dub effect where a sound converges on a
single low
frequency is often BBD set to maximum feedback I think, but is sometimes
mistaken for tape
echo or early DDL.


to the precision of the A/D and D/A converters (which is considerable), 
there is no reason that a modern digital delay line can't be made to 
sound like the old CCD (or BBD or whatever you wanna call it) delay 
products.  like an analog[ue] amplifier, you might have to model in 
analog non-linearities, noise, buzz, hum, and interference to make it 
sound the same.  with the exception of the non-linearities, i normally 
think that modeling the noise and buzz leaking through is not 
desirable.  who knows?
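As a sketch of the kind of modeling described above — a plain digital delay with a one-pole darkening filter and a tanh soft-clip standing in for BBD-style nonlinearity (noise, buzz, and hum deliberately not modeled). The function name, cutoff, and drive values are illustrative assumptions, not anyone's actual product design:

```python
import numpy as np

def bbd_flavored_delay(x, delay_samples, feedback=0.5, drive=2.0, fs=48000):
    """Digital delay with a one-pole lowpass and tanh soft-clip in the
    recirculating path -- a rough stand-in for BBD-style coloration."""
    y = np.zeros(len(x))
    buf = np.zeros(delay_samples)
    idx = 0
    lp = 0.0                                 # one-pole lowpass state
    a = np.exp(-2 * np.pi * 3000.0 / fs)     # ~3 kHz cutoff (assumed value)
    for n in range(len(x)):
        d = buf[idx]
        lp = (1 - a) * d + a * lp            # darken each pass through the "chip"
        wet = np.tanh(drive * lp) / drive    # gentle saturation
        y[n] = x[n] + wet
        buf[idx] = x[n] + feedback * wet     # recirculate into the delay
        idx = (idx + 1) % delay_samples
    return y
```

Each recirculation gets darker and slightly more saturated, which is roughly the character repeated passes through a bucket-brigade chain impart.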


one thing i think might be cool is to use different delay/echo effects 
on each string of a hex-pickup gitfiddle.  just like you might have 
different pitch shifting done on each string.



--

r b-j  r...@audioimagination.com

Imagination is more important than knowledge.





Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread Nigel Redmon
To add to Robert’s comment on discrete-time analog…

The only thing special about digital sampling is that it’s stable (those 
digital numbers can be pretty durable—the analog samples don’t hold up so well) 
and convenient for computation. But the digital audio samples are just a 
representation of the analog audio that’s been pulse amplitude modulated. (I 
never worked with BBDs or CCDs, and suspect there’s some zero-order hold 
involved in practical implementations, but it doesn’t matter as long as that’s 
compensated for—digital to analog converters have also dealt with the same sort 
of issue. Still, the basis is that those samples in the BBD/CCD represent 
impulses, momentary snapshots.) Just as with the digital versions, in the 
analog versions you have a lowpass filter to ensure the input spectrum remains 
below half the sample rate, and on the output you have a filter to get rid of 
the aliased images, created by the modulation process.

In the early days of digital delays, the analog delays had some advantages that 
are probably not obvious to someone coming from today’s knowledge. For 
instance, today we’d make a delay with a constant sample rate, and use a 
software LFO and an interpolated delay line to make a flanger. But back then, 
computation was difficult and costly, so it was done the same way that the 
analog delays did it: variable sample rate and vary the clock frequency with a 
hardware LFO. The advantage of digital was better fidelity, but the analog 
delays could sweep over a much wider range. Digital memory wasn’t so fast back 
then, and super-fast A/Ds were huge bucks (I worked for a group in a company in 
the late ‘70s that made a 40 MHz 8-bit A/D chip that was $800 in low 
quantities, and they sold ‘em as fast as they could make ‘em). But you could 
probably sweep those CCD clocks from something like 20 kHz to over 1 MHz (kind 
of guessing here, but the point is that you could get nowhere remotely close to 
that with a DDL).
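A minimal sketch of the modern approach described above — a constant sample rate with a software LFO sweeping a fractionally interpolated delay line. Function name and all parameter values are illustrative assumptions:

```python
import numpy as np

def flanger(x, fs=48000, depth_ms=2.0, rate_hz=0.25, mix=0.7):
    """Flanger at a fixed sample rate: a software LFO modulates the
    read position of a linearly interpolated delay line."""
    max_delay = int(fs * depth_ms / 1000) + 2
    buf = np.zeros(max_delay)
    y = np.zeros(len(x))
    w = 0
    for n in range(len(x)):
        buf[w] = x[n]
        # LFO maps to a delay between ~0 and depth_ms
        lfo = 0.5 * (1 + np.sin(2 * np.pi * rate_hz * n / fs))
        d = 1.0 + lfo * (max_delay - 3)          # fractional delay in samples
        ri = (w - d) % max_delay                 # fractional read index
        i0, frac = int(ri), ri - int(ri)
        i1 = (i0 + 1) % max_delay
        y[n] = x[n] + mix * ((1 - frac) * buf[i0] + frac * buf[i1])
        w = (w + 1) % max_delay
    return y
```

The interpolation does in software what the variable clock did in hardware: it lets the delay move smoothly through fractional-sample values, which is what makes the comb-filter notches sweep without zipper noise.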


 On Jul 20, 2015, at 12:16 PM, robert bristow-johnson 
 r...@audioimagination.com wrote:
 
 On 7/20/15 2:44 PM, padawa...@obiwannabe.co.uk wrote:
 Whenever vintage delays come to my mind, I hear the sound of the bucket 
 brigade
 delay lines
 that were made from a chain of capacitors and switches. In the early 80s 
 there
 were many
 electronic magazine articles and kits to build them. The SAD chips had a 
 maximum
 delay time
 of about 200ms. Were they digital? Kind of.
 
 no, they weren't.  not really.
 
 discrete-time is not the same as digital.
 
 
  Were they analogue? Kind of too.
 
 they were fully analog[ue].
 
 A lost technology
 from gap between analogue and digital, you can hear them on a surprising 
 number
 of records,
 especially early electronic.  That odd dub effect where a sound converges 
 on a
 single low
 frequency is often BBD set to maximum feedback I think, but is sometimes
 mistaken for tape
 echo or early DDL.
 
 to the precision of the A/D and D/A converters (which is considerable), there 
 is no reason that a modern digital delay ling can't be made to sound like the 
 old CCD (or BBD or whatever you wanna call it) delay products.  like an 
 analog[ue] amplifier, you might have to model in analog non-linearities, 
 noise, buzz, hum, and interference to make it sound the same.  with the 
 exception of the non-linearities, i normally think that modeling the noise 
 and buzz leaking through is not desirable.  who knows?
 
 one thing i think might be cool is to use different delay/echo effects on 
 each string of a hex-pickup gitfiddle.  just like you might have different 
 pitch shifting done on each string.
 
 
 -- 
 
 r b-j  r...@audioimagination.com
 
 Imagination is more important than knowledge.
 
 
 

Re: [music-dsp] A little frivolous diversion on the effect of using a delay

2015-07-20 Thread robert bristow-johnson

On 7/20/15 4:52 PM, Theo Verelst wrote:

robert bristow-johnson wrote:

On 7/20/15 2:44 PM, padawa...@obiwannabe.co.uk wrote:

Whenever vintage delays come to my mind, I hear the sound of the
bucket brigade delay lines


just for the record, none of them content words were written by me.

And it's back, in the Prophet-6. I built one of those dual BBD effects 
with filters and electronics, with interesting options, including a 
sine-wave-LFO-modulated clock to act as a Leslie effect, which was fun, 
though without noise suppression it was very hissy.


so it's delay modulation, which is supposed to be the Leslie?  you 
really should have a synchronous, but outa-phase, amplitude modulation 
along with the delay modulation, to emulate a Leslie.  and multiple 
reflection paths to an auditory scene (a wall or two with reflections) 
and a stereo output derived from that.
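A rough single-path sketch of that suggestion — delay (Doppler) modulation paired with a synchronized, phase-offset amplitude modulation from the same rotor phase. The reflection paths and stereo derivation are omitted, and the function name and all parameters are made-up assumptions:

```python
import numpy as np

def leslie_voice(x, fs=48000, rotor_hz=6.0, depth_samp=20.0, am_depth=0.3):
    """One rotating-horn path: Doppler via modulated fractional delay,
    plus a synchronized amplitude modulation a quarter cycle offset
    (loudness peaks when the horn faces the listener, not when the
    delay is at an extreme)."""
    buf = np.zeros(int(depth_samp) + 4)
    L = len(buf)
    y = np.zeros(len(x))
    w = 0
    for n in range(len(x)):
        buf[w] = x[n]
        ph = 2 * np.pi * rotor_hz * n / fs
        d = 1.0 + 0.5 * depth_samp * (1 + np.sin(ph))  # Doppler: path length swings
        am = 1.0 + am_depth * np.cos(ph)               # AM, 90 degrees out of phase
        ri = (w - d) % L
        i0, frac = int(ri), ri - int(ri)
        y[n] = am * ((1 - frac) * buf[i0] + frac * buf[(i0 + 1) % L])
        w = (w + 1) % L
    return y
```

Running two such paths (horn and drum rotor at different rates) into a couple of simulated wall reflections, then deriving left/right from those, would get closer to the full effect rbj describes.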


That sure is better even with a simple software delay and cheap 
built-in sound card now, even at 16 bit, a delay can work fine at CD 
quality.


My interest at some point, which got me thinking, is that certain 
synthesizers (in this case the Yamaha Motif line) have nice 
oscillators, but it isn't possible to get certain sampled waves to not 
overlay more than 1 sample, in certain cases probably the same 
waveform playing over two sample replay partial engines, with a delay 
in between. So it would be a nice idea to be able to record the signal 
of a single note, and somehow extract the one sample from the two or 
three that play at the same time, presuming they're just time shifted.


uhm, what do you mean?  do you mean that the samples for each voice are 
being played out at different sample rates and zero-order held and then 
the different voices overlay their samples coming out at different 
rates?  i might think that if you analog LPF each voice separately 
before adding them, the overlay more than 1 sample wouldn't be an issue.




I realize that's a bit tough and might involve inversions with linear 
algebra and some iterations even, but it's a fun subject. I mean so 
much going on, but simply extracting a signal in the presence of a 
delayed added version of the same signal isn't generally available!




you mean inverting the filter:

 H(s) =  1 + alpha*e^(-s*T)

where T is the delay and alpha is the relative gain of the delayed added 
version?  that can be generally done if you know T and alpha.
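In discrete time, with D = T*fs samples of delay, that transfer function becomes H(z) = 1 + alpha*z^(-D), and its inverse is the recursion x[n] = y[n] - alpha*x[n-D], stable for |alpha| < 1. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def remove_delayed_copy(y, D, alpha):
    """Invert y[n] = x[n] + alpha*x[n-D], the discrete analog of
    H(s) = 1 + alpha*e^(-sT): recursively, x[n] = y[n] - alpha*x[n-D].
    Exact if D and alpha are known; stable while |alpha| < 1."""
    x = np.zeros(len(y))
    for n in range(len(y)):
        x[n] = y[n] - (alpha * x[n - D] if n >= D else 0.0)
    return x
```

In practice the hard part is the one Theo raises: estimating D and alpha from the recording (e.g. from peaks of the autocorrelation or cepstrum) before the inversion can be applied.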




--

r b-j  r...@audioimagination.com

Imagination is more important than knowledge.


