Hi,

What started as a question on the Kodi (XBMC) forum [1] about how to turn off
Dolby Digital (AC3) dynamic range compression turned into a patch, and then
into a discussion on GitHub [2] about the default value for AC3 dynamic range
compression.

Kodi, using ffmpeg, didn't set the 'drc_scale' option on avcodec, so the DRC
metadata embedded in AC3 streams was applied and the output ended up
compressed. On virtually every hardware and software decoder I've seen, DRC
is on by default (I'm currently trying to write a patch for VLC, which has
the same issue). This has led me to the hypothesis that this is why people
think DTS and AAC sound better than AC3. For most people they do, but they
don't have to: AC3 is at an unfair disadvantage because, as far as I know,
it's the only codec with this metadata embedded, and for some reason
developers/manufacturers have chosen to apply it by default, even in
high-end receivers.
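For anyone who wants to compare the two behaviours themselves, the decoder
option can be set from the ffmpeg command line (decoder options go before
the -i they apply to; the file names here are just placeholders):

```shell
# Decode the AC3 audio with DRC disabled: drc_scale 0 tells the
# ac3 decoder to ignore the embedded DRC metadata entirely.
ffmpeg -drc_scale 0 -i input.mkv -vn no_drc.wav

# For comparison, the current default (drc_scale 1) applies the
# embedded DRC metadata at full strength.
ffmpeg -drc_scale 1 -i input.mkv -vn with_drc.wav
```

Applications using libavcodec directly can pass the same option in the
options dictionary given to avcodec_open2(), e.g.
av_dict_set(&opts, "drc_scale", "0", 0) before opening the decoder.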

See this particular post [3] with recordings and waveform screenshots of the
difference with DRC on and off.

So I was wondering whether the default of 1 for drc_scale was chosen
deliberately. I feel strongly that it should be 0 by default, because I don't
see why streams with AC3 audio (as opposed to DTS, AAC, PCM, or anything
else) should be treated differently when the user didn't ask for it.

Regards,

Wiebe


[1] http://forum.kodi.tv/showthread.php?tid=219228
[2] https://github.com/xbmc/xbmc/pull/6820
[3] http://forum.kodi.tv/showthread.php?tid=219228&pid=1958799#pid1958799
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
http://ffmpeg.org/mailman/listinfo/ffmpeg-devel