On Wed, Oct 12, 2016 at 3:58 AM, Anton Khirnov <an...@khirnov.net> wrote:
> Quoting Hendrik Leppkes (2016-10-11 22:33:43)
>> On Tue, Oct 11, 2016 at 9:34 PM, Anton Khirnov <an...@khirnov.net> wrote:
>> > This format is used internally by the QSV encoder to store the encoded
>> > bitstream.
>> What does that even mean?
>> It smells like some evil hackery going into the nice clean hwcontext.
> Moderately evil.
> If you want to use GPU surfaces with QSV, you need to supply a frame
> allocator, which gets invoked to hand surface pools to the SDK. For
> encoding, this allocator gets invoked not only for the pool of input
> frames, but also for a separate pool of (I assume) reconstructed frames,
> and for another pool of MFX_FOURCC_P8 surfaces, for which the allocator
> on Windows needs to return D3DFMT_P8 d3d9 surfaces. I think those are
> used to store the encoded bitstream on the GPU.
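> To illustrate, the allocator's Alloc callback ends up with a branch
> roughly like this -- just a sketch, the helper names are made up and
> the bodies are stubs, it's not the actual patch:
>
>     #include <mfxvideo.h>
>
>     /* stand-in for the real pool code (hypothetical helper) */
>     static mfxStatus alloc_p8_pool(mfxHDL pthis, mfxFrameAllocRequest *req,
>                                    mfxFrameAllocResponse *resp)
>     {
>         /* on Windows: create req->NumFrameSuggested D3DFMT_P8 d3d9
>          * surfaces and return their mids in resp */
>         return MFX_ERR_UNSUPPORTED;
>     }
>
>     /* stand-in for the normal NV12 & co. path (hypothetical helper) */
>     static mfxStatus alloc_video_pool(mfxHDL pthis, mfxFrameAllocRequest *req,
>                                       mfxFrameAllocResponse *resp)
>     {
>         return MFX_ERR_UNSUPPORTED;
>     }
>
>     static mfxStatus frame_alloc(mfxHDL pthis, mfxFrameAllocRequest *req,
>                                  mfxFrameAllocResponse *resp)
>     {
>         /* called for the input pool, the reconstructed-frame pool and
>          * the MFX_FOURCC_P8 bitstream pool */
>         if (req->Info.FourCC == MFX_FOURCC_P8)
>             return alloc_p8_pool(pthis, req, resp);
>         return alloc_video_pool(pthis, req, resp);
>     }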
> In any case, while using P8 surfaces for this purpose is indeed rather
> strange, there's nothing especially wrong with supporting them in the
> DXVA2 hwcontext; it's just another surface format. The only hacky part
> is the palette itself -- I didn't find a way to retrieve the palette
> from a surface, so it's "emulated" with a dummy zero-filled buffer. The
> actual contents are irrelevant for my specific use case (and I don't
> expect it will be used for anything else).
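> I.e. roughly this -- again just a sketch, the function name is made up:
>
>     #include <stdint.h>
>     #include <string.h>
>
>     /* hypothetical helper: there is no API (that I found) to read the
>      * palette off the surface, so just hand back zeroes */
>     static void fill_dummy_palette(uint8_t *dst_pal)
>     {
>         /* 256 zeroed 32-bit entries; the contents don't matter when
>          * the surface only carries the encoded bitstream */
>         memset(dst_pal, 0, 256 * sizeof(uint32_t));
>     }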

Can we have this detailed explanation in the commit log, please?