Mon, May 19, 2025, 14:44 Mark Filipak <markfilipak.i...@gmail.com>:

> I had asked:
> >> Is it all right to 'talk' about the picture stream? For the various
> >> permutations of OETF (in camera), what comes out the decoder? I assume
> that
> >> for BT2020 SMPTE2084 it's HDR10 PQ, and for BT2100 it's either HDR10 PQ
> or
> >> HDR10 HLG. In other words, the decoder just makes pictures but doesn't
> do
> >> color transforms. To the best of your knowledge, is that correct?
>
> On 19/05/2025 01.57, Erik Dobberkau wrote:
> > It depends. For 'raw' streams (not meaning the camera .RAW format here),
> > one should not assume the stream to carry metadata about the encoding
> > characteristics, it's just pixel data. ... I have not yet come across a
> decoder which would
> > automatically apply a color transformation ...
>
> Hmmm... I meant FFmpeg's decoders. If a coded source was HDR10 PQ, for
> example, will the decoded
> picture stream (processing pipeline, whatever) be HDR10 PQ? I'm assuming
> it would.
>
> I have a UHD m2ts that FFmpeg identifies "yuv420p10le(tv,
> bt2020nc/bt2020/smpte2084)". To the best
> of my knowledge that means so-called "yuv420" format, planar pixel
> matrices, 10 bits-per-primary,
> little-endian bytes, with limited black..white range and -- now I have to
> guess --
> colorspace=bt2020nc, primaries=bt2020 (i.e. RGB to YUV formula),
> transfer=smpte2084 (i.e. HDR using
> PQ transfer function). Is all that correct?
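
For what it's worth, those fields can be read back from the stream rather
than guessed; a minimal ffprobe sketch (input.m2ts is just a placeholder
name here):

  ffprobe -v error -select_streams v:0 \
    -show_entries stream=pix_fmt,color_range,color_space,color_transfer,color_primaries \
    -of default=noprint_wrappers=1 input.m2ts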
>
> I'm going to assume that's all correct for what follows.
>
> In "bt2020nc", what does "nc" signify?
>
> Why is 'colorspace' there at all? Doesn't 'primaries' together with
> 'transfer' tell the whole story?
>
> I understand it's important (imperative, even) to linearize to gbrpf32le
> and then convert down to SDR in order to prepare the pictures for
> encoding. To your knowledge, do the various CinGG color-LUTs bypass
> linearization by going directly from HDR-PQ or -HLG to SDR, for example?
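
In ffmpeg/libavfilter terms, that linearization step is usually done with
the zscale (libzimg) filter before any tonemapping; a rough sketch,
assuming a zscale-enabled build and placeholder file names:

  ffmpeg -i input.m2ts -vf "zscale=transfer=linear:npl=100,format=gbrpf32le" ...

(npl is the nominal peak luminance assumed for the PQ source; 100 is only
an illustrative value.)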
>



Since I brought up CinGG here, I'd like to clarify that we only have
colorspace/range settings in the GUI, implemented via swscale and internal
functions.

Sadly, HDR-specific functionality has not been implemented, so for now I
mostly experiment with the various supported ffmpeg (libavfilter) video
filters in combination with native plugins.
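
For example, the commonly cited libavfilter chain for PQ-to-SDR conversion
goes through linear light and float RGB before tonemapping (this needs a
build with zscale/libzimg; the file names, the hable operator and the x264
encoder below are just illustrative choices):

  ffmpeg -i input.m2ts -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" -c:v libx264 output_sdr.mkv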

From my understanding, the working color space inside CinGG IS linear;
otherwise the compositing modes between video tracks simply would not work
as intended.
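
A toy illustration of why linear matters (assuming a plain 2.2 gamma
instead of a real OETF): mixing 50%-coded gray with black at 50/50 opacity
on the coded values gives 0.25 coded, i.e. about 0.25^2.2 ≈ 0.047 in
linear light, while the same mix done on linearized values gives
(0.5^2.2 + 0)/2 ≈ 0.109, roughly twice as much light, which is the
physically plausible result.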

There are a few working modes: yuv444 8bpc, yuva4444 8bpc, rgb888,
rgba8888, rgbf323232, and rgbaf32323232.

So, no half floats (fp16), and no 16-bit-per-channel ints either (only
temporary buffers use that format during transfer from ffmpeg's decoders
to the working buffers).

Another fork of Cinelerra went so far as to remove everything but a 16 bpc
internal working format, but I think that is too drastic.

But for an HDR-to-SDR workflow on Windows, I think Avidemux 2.8.1 might be
useful; it does have some (software) tonemapper options.

HandBrake should also have a (software) tonemapper with a few GUI
settings, so you might try it too.

>
> The fish are swimming around in my brain and they're beginning to eat each
> other.
_______________________________________________
ffmpeg-user mailing list
ffmpeg-user@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-user

To unsubscribe, visit link above, or email
ffmpeg-user-requ...@ffmpeg.org with subject "unsubscribe".
