On Mon, 24 Mar 2014 23:51:53 +0200 Rémi Denis-Courmont <[email protected]> wrote:
> Le lundi 24 mars 2014, 22:36:29 Luca Barbato a écrit :
> > On 24/03/14 22:19, Rémi Denis-Courmont wrote:
> > > Le lundi 24 mars 2014, 21:46:54 Luca Barbato a écrit :
> > >>> If suddenly libavcodec started offering AV_PIX_FMT_VDPAU for 4:2:2,
> > >>> decoding would break. VLC would not know that suddenly a pixel format
> > >>> that was 4:2:0 has become 4:2:2 (too).
> > >>
> > >> Nothing would happen "suddenly": you set the pixel format in the global
> > >> context, so you already know it by the time your get_format callback
> > >> runs.
> > >
> > > Oh yes it will break.
> > >
> > > The hardware might be able to decode a 420 bitstream into a 422 surface.
> > > But it definitely will not be able to decode a 422 bitstream into a 420
> > > surface, because the memory allocation will be too small.
> >
> > The software asking the hardware to do the decoding is the same software
> > setting up the surface.
> >
> > > Thus all applications that assume that AV_PIX_FMT_VDPAU or
> > > AV_PIX_FMT_VAAPI is 420 will break. Even your own avconv_vdpau will
> > > break!
> >
> > If it does, that is a bug and it will be fixed.
>
> That is not a bugfix. That is a redefinition of semantics after the fact in
> incompatible ways, a.k.a. a silent ABI and API break.
>
> AV_PIX_FMT_VAAPI and AV_PIX_FMT_VDPAU have meant 4:2:0 8-bit YUV so far. If
> you want to change that, you need to define new enumeration values and/or a
> get_format2() callback. Otherwise that is a silent API and ABI breakage.
>
> > >>> It would therefore allocate VDP_CHROMA_TYPE_420
> > >>> surfaces from the hardware. That is obviously wrong and a backward
> > >>> compatibility breakage.
> > >>>
> > >>> The correct approach, should the need arise, would be to define
> > >>> AV_PIX_FMT_VDPAU_422 or whatever. Then old VLC versions would ignore
> > >>> it and fall back to software decoding as before. New versions can add
> > >>> support safely.
> > >>
> > >> No, and you got 3 different explanations why it is wrong.
> > >
> > > The only explanation I got was that it is possible for an application
> > > to work around the problem by checking the last pixel format. That is
> > > an ugly, limiting work-around and does not solve backward compatibility
> > > in any way.
> >
> > What I told you is that libavcodec knows _nothing_ about that, since both
> > the global context setup and the get_buffer callback are on the user.
>
> What I keep telling you is that the application needs to know the chroma
> type, and that the one that knows it is libavcodec.
>
> I agree that libavcodec does not need to know the chroma type insofar as it
> is not allocating the surfaces. But it needs to convey the chroma type to
> the application, and the pixel format seems to be the obvious way to
> achieve that.

Not sure about this. Different decoders might support different pixel
formats for the same thing. For example, it seems VDA always prefers packed
422, even for 420. Likewise, there could be decoders which support hi10p
decoding but pretend the surfaces are normal 420 8-bit surfaces. (There was
something about adding 10-bit support for vdpau some months ago. I'm not
sure how that was supposed to work, or whether it was just a
misunderstanding, but it looked like it was pretending to return 8-bit
surfaces for 10-bit decoding.) Is it really that straightforward?

Maybe separating this would be better. We don't store the codec profile in
the pixel format either. And when we did do that, it was ugly and we
eventually removed it (like AV_PIX_FMT_VDPAU_H264). IMO it's best if
AV_PIX_FMT_VDPAU means that it's a VdpVideoSurface, instead of attempting to
encode more into it.

_______________________________________________
libav-devel mailing list
[email protected]
https://lists.libav.org/mailman/listinfo/libav-devel
