2017-06-02 14:12 GMT+02:00 Theo Verelst <theo...@theover.org>:
> While designing and testing an automatically created filter/multi-compressor
> bank of 30 "jack-rack" holders of (Linux) LADSPA DSP elements, I was
> reminded of something I've noticed before: some audio effects use a lot of
> CPU resources when idle, in other words when no input signal is present.
>
> The way I work is having Jack run its standard 32-bit floating-point audio,
> and having a lot of these racks each suddenly use half a processor thread
> has two negative side effects: the CPU gets hot (even too hot when the
> clock is maxed out), and there are X-runs (audio processing graph failures)
> which, even after signal returns, take a while to disappear before normal
> processing resumes.
>
> I tested one solution to this problem; another would be to check out the
> open-source code and fix the problem there. The fix: inject the signal
> graph with a tiny -160 dB noise signal, which prevents it from going into
> overload mode. So I made another jack-rack with a noise generator, set it
> to -80 dB, limited the signal, and attenuated it another 80 dB, so that
> none of the signal processing I work with has serious trouble with the
> added noise. This appears to work: the processor load remains constant in
> the absence of signal.

This looks denormals-related.

If that's the case, a better option would be to set the processor flags
that flush denormals to zero in the host or, even better, in the
plugins (the FTZ and DAZ flags on x86/x86-64). Sometimes recompiling
with the proper compiler flags is sufficient (e.g., -ffast-math with
GCC, which links startup code that sets these flags).
_______________________________________________
dupswapdrop: music-dsp mailing list
music-dsp@music.columbia.edu
https://lists.columbia.edu/mailman/listinfo/music-dsp
