This patch set changes how request_channel_layout is handled for downmixed output, allows printing libdcadec messages through av_log, adds support for setting some potentially useful options, and makes the decoder honor the -err_detect option.
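
For reference, the new request_channel_layout mapping (explained below) amounts to roughly the following sketch. This is not the literal patch: it assumes libdcadec's DCADEC_FLAG_KEEP_DMIX_2CH/_6CH flags and the usual <libdcadec/dca_context.h> header, and the helper name is made up for illustration.

    #include <stdint.h>
    #include <libdcadec/dca_context.h>
    #include "libavutil/channel_layout.h"

    /* Sketch only: pick libdcadec downmix flags from the caller's
     * request_channel_layout.  With 2CH alone, a 2.0 downmix is produced
     * only when the stream carries embedded 2.0 coefficients; otherwise
     * libdcadec falls back to the 5.1 downmix, which is always present
     * for 6.1 and 7.1 content. */
    static int map_request_channel_layout(uint64_t request_channel_layout)
    {
        int flags = 0;

        if (request_channel_layout == AV_CH_LAYOUT_STEREO)
            flags |= DCADEC_FLAG_KEEP_DMIX_2CH;
        else if (request_channel_layout == AV_CH_LAYOUT_5POINT1 ||
                 request_channel_layout == AV_CH_LAYOUT_5POINT1_BACK)
            flags |= DCADEC_FLAG_KEEP_DMIX_6CH;

        return flags;
    }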
Forcing a 5.1 to 2.0 downmix via an undocumented libdcadec feature (specifying both the 2CH and 6CH flags together) is removed. With the 2CH flag used alone, a 2.0 downmix is output whenever possible, but if no custom downmix coefficients are present, a 5.1 downmix (which is always present for 6.1 and 7.1 content) is output instead. This allows the caller to perform the 5.1 to 2.0 downmix with whatever downmix parameters it likes instead of the default libdcadec coefficients, which may cause clipping depending on the stream. The warnings about missing coefficients are removed because no downmix is forced any more.

Options to select the LFE channel interpolation filter in bit-inexact mode and to force core-only decoding are added. Unfortunately, they don't seem to apply during probing: for example, ffmpeg -c:a libdcadec -core_only:a 1 -i master-audio-71.dtshd -f null - will initially report "Audio: dts (DTS-HD MA), 48000 Hz, 7.1, s32p (24 bit)" and then switch to 5.1 core-only output. I'm not sure how to fix this.

Setting "-err_detect explode" now causes libdcadec to abort decoding a frame on non-fatal errors such as a core extension decoding failure.

foo86 (4):
  avcodec/libdcadec: fix request_channel_layout
  avcodec/libdcadec: implement logging callback
  avcodec/libdcadec: add some useful options
  avcodec/libdcadec: honor -err_detect option

 libavcodec/libdcadec.c | 97 ++++++++++++++++++++++++++++++++++++++------------
 1 file changed, 75 insertions(+), 22 deletions(-)

--
2.1.4