Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-19 Thread Pekka Paalanen
On Wed, 19 May 2021 11:53:37 +0300
Pekka Paalanen  wrote:

...

> TL;DR:
> 
> I would summarise my comments so far into these:
> 
> - Telling the kernel the color spaces and letting it come up with
>   whatever color transformation formula from those is not enough,
>   because it puts the render intent policy decision in the kernel.
> 
> - Telling the kernel what color transformations need to be done is
>   good, if it is clearly defined.
> 
> - Using an enum-based UAPI to tell the kernel what color
>   transformations needs to be done (e.g. which EOTF or EOTF^-1 to apply
>   at a step in the abstract pipeline) is very likely ok for many
>   Wayland compositors in most cases, but may not be sufficient for all
>   use cases. Of course, one is always bound by what hardware can do, so
>   not a big deal.
> 
> - You may need to define mutually exclusive KMS properties (referring
>   to my email in another branch of this email tree).
> 
> - I'm not sure I (we?) can meaningfully review things like "SDR boost"
>   property until we know ourselves how to composite different types of
>   content together. Maybe someone else could.
> 
> Does this help or raise thoughts?
> 
> The work on Weston CM right now is aiming to get it up to a point
> where we can start nicely testing different compositing approaches and
> methods and parameters, and I expect that will also feed back into the
> Wayland CM protocol design as well.

I forgot to mention one important thing:

Generic Wayland compositors will be using KMS planes opportunistically.
The compositor will be switching between GL and KMS compositing on
demand, refresh by refresh. This means that GL and KMS compositing must
produce identical results, or users will see color flicker on every
switch.

This is a practical reason why we really want to know in full detail
how the KMS pipeline processes pixels.
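As a concrete illustration of why the exact formula matters (my own
sketch, assuming sRGB encoding; none of this is from the proposal):
blending the same two pixels in linear light versus directly on the
non-linear values gives visibly different results, and that difference
is exactly what shows up as a flicker when the compositor switches
between GL and KMS composition.

#include <math.h>
#include <stdio.h>

static double srgb_eotf(double v)
{
        return v <= 0.04045 ? v / 12.92 : pow((v + 0.055) / 1.055, 2.4);
}

static double srgb_inv_eotf(double l)
{
        return l <= 0.0031308 ? l * 12.92 : 1.055 * pow(l, 1.0 / 2.4) - 0.055;
}

int main(void)
{
        double bg = 0.5, fg = 1.0, a = 0.5;

        /* 50% white over mid-grey, blended in linear light, re-encoded. */
        double linear = srgb_inv_eotf(a * srgb_eotf(fg) +
                                      (1.0 - a) * srgb_eotf(bg));
        /* The same blend done directly on the encoded values. */
        double nonlinear = a * fg + (1.0 - a) * bg;

        printf("linear-light blend: %.3f, non-linear blend: %.3f\n",
               linear, nonlinear);
        return 0;
}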


Thanks,
pq




Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-19 Thread Pekka Paalanen
On Tue, 18 May 2021 10:19:25 -0400
Harry Wentland  wrote:

> On 2021-05-18 3:56 a.m., Pekka Paalanen wrote:
> > On Mon, 17 May 2021 15:39:03 -0400
> > Vitaly Prosyak  wrote:
> >   
> >> On 2021-05-17 12:48 p.m., Sebastian Wick wrote:  

...

> >>> I suspect that this is not about tone mapping at all. The use cases
> >>> listed always have the display in PQ mode and just assume that no
> >>> content exceeds the PQ limitations. Then you can simply bring all
> >>> content to the color space with a matrix multiplication and then map the
> >>> linear light content somewhere into the PQ range. Tone mapping is
> >>> performed in the display only.  
> > 
> > The use cases do use the word "desktop" though. Harry, could you expand
> > on this, are you seeking a design that is good for generic desktop
> > compositors too, or one that is more tailored to "embedded" video
> > player systems taking the most advantage of (potentially
> > fixed-function) hardware?
> >   
> 
> The goal is to enable this on a generic desktop, such as generic Wayland
> implementations or ChromeOS. We're not looking for a custom solution for
> some embedded systems, though the solution we end up with should obviously
> not prevent an implementation on embedded video players.

(There is a TL;DR: at the end.)

Echoing a little bit what Sebastian already said, I believe there are
two sides to this again:
- color management in the traditional sense
- modern standardised display technology

It was perhaps too harsh to say that generic Wayland compositors cannot
use enum-based color-related UAPI. Sometimes they could, sometimes it
won't be good enough.

Traditional color management assumes that no two monitors are the same,
even if they are the same make, model, and manufacturing batch, and are
driven exactly the same way. Hence, all monitors may require
calibration (adjusting monitor knobs), and/or they may require
profiling (measuring the light emission with a special hardware device
designed for that). Also the viewing environment has an effect.

For profiling to be at all meaningful, calibration must be fixed. This
means that there must be no dynamic on-the-fly adaptation done in the
monitor, in the display hardware, or in the kernel. That is a tall
order that I guess is going to be less and less achievable, especially
with HDR monitors.

The other side is where the end user trusts the standards, and trusts
that the drivers and the hardware do what they are specified to do.
This is where you can trust that the monitor does the tone-mapping magic
right.

Weston needs to support both approaches, because we want to prove our
new approach to traditional color management, but we also want to
support HDR, and if possible, do both at the same time. Doing both at
the same time is foremost in our minds, because it is also the hardest
thing to achieve. If that can be done, then everything else works out
too.

However, this should not exclude the possibility to trust standards and
monitor magic, when the end user wants it.

It's also possible that a monitor simply doesn't support a mode that
would enable fully color managed HDR, so Weston will need to be able to
drive monitors with e.g. BT.2020/PQ data eventually. It's just not the
first goal we have.

This debate is a little bit ironic. The Wayland approach to traditional
color management is that end users should trust the display server to
do the right thing, where before people only trusted the individual
apps using a specific CMS implementation. The display server was the
untrusted one that should just get out of the way and not touch
anything. Now I'm arguing that I don't want to trust monitor magic, who
knows what atrocities it does to my picture! But take the next logical
step, and one would be arguing that end users should also trust
monitors to do the right thing. :-)

The above has two catches:

- Do you actually trust hardware manufacturers and marketers and EDID?
  Monitors have secret sauce you can't inspect nor change.

- You feed a single video stream to a monitor, in a single format,
  encoding and color space. The display server OTOH gets an arbitrary
  number of input video streams in arbitrary formats, encodings, and
  color spaces, and it needs to composite them into one.

Composition is hard. It's not enough to know what kind of signals you
take in and what kind of signal you must output. You also need to know
what the end user wants from the result: the render intent.

Even if we trust the monitor magic to do the right thing in
interpreting and displaying our output signal, we still need to know
what the end user wants from the composition, and we need to control
the composition formula to achieve that.
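To make "composition formula" concrete, here is one possible (and
deliberately simplistic) sketch of a single blending step, assuming the
SDR plane has already been linearized to [0, 1] and the HDR plane is in
absolute linear nits. The "SDR white" scale factor and the choice of
blending in linear light are exactly the render-intent policy decisions
meant above; this example does not settle them.

#include <stdio.h>

/* Sketch only: scale linearized SDR by a chosen "SDR white" luminance
 * and alpha-blend with HDR content in linear light, in nits. */
static double blend_nits(double sdr_linear, double sdr_white_nits,
                         double hdr_nits, double alpha)
{
        return alpha * sdr_linear * sdr_white_nits +
               (1.0 - alpha) * hdr_nits;
}

int main(void)
{
        /* e.g. 80% SDR grey, SDR white chosen as 203 nits, over a
         * 600 nit HDR pixel at 50% coverage. */
        printf("%.1f nits\n", blend_nits(0.8, 203.0, 600.0, 0.5));
        return 0;
}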

TL;DR:

I would summarise my comments so far into these:

- Telling the kernel the color spaces and letting it come up with
  whatever color transformation formula from those is not enough,
  because it puts the render intent policy decision in the kernel.

- Telling the kernel what 

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-18 Thread Sebastian Wick

On 2021-05-18 16:19, Harry Wentland wrote:

On 2021-05-18 3:56 a.m., Pekka Paalanen wrote:

On Mon, 17 May 2021 15:39:03 -0400
Vitaly Prosyak  wrote:


On 2021-05-17 12:48 p.m., Sebastian Wick wrote:

On 2021-05-17 10:57, Pekka Paalanen wrote:

On Fri, 14 May 2021 17:05:11 -0400
Harry Wentland  wrote:


On 2021-04-27 10:50 a.m., Pekka Paalanen wrote:

On Mon, 26 Apr 2021 13:38:49 -0400
Harry Wentland  wrote:


...


## Mastering Luminances

Now we are able to use the PQ 2084 EOTF to define the luminance of
pixels in absolute terms. Unfortunately we're again presented with
physical limitations of the display technologies on the market today.

Here are a few examples of luminance ranges of displays.

| Display  | Luminance range in nits |
|  | --- |
| Typical PC display   | 0.3 - 200 |
| Excellent LCD HDTV   | 0.3 - 400 |
| HDR LCD w/ local dimming | 0.05 - 1,500 |

Since no display can currently show the full 0.0005 to 10,000 nits
luminance range the display will need to tonemap the HDR content, i.e
to fit the content within a display's capabilities. To assist with
tonemapping HDR content is usually accompanied with a metadata that
describes (among other things) the minimum and maximum mastering
luminance, i.e. the maximum and minimum luminance of the display that
was used to master the HDR content.

The HDR metadata is currently defined on the drm_connector via the
hdr_output_metadata blob property.

It might be useful to define per-plane hdr metadata, as different
planes might have been mastered differently.

I don't think this would directly help with the dynamic range blending
problem. You still need to establish the mapping between the optical
values from two different EOTFs and dynamic ranges. Or can you know
which optical values match the mastering display maximum and minimum
luminances for not-PQ?

My understanding of this is probably best illustrated by this example:

Assume HDR was mastered on a display with a maximum white level of 500
nits and played back on a display that supports a max white level of 400
nits. If you know the mastering white level of 500 you know that this is
the maximum value you need to compress down to 400 nits, allowing you to
use the full extent of the 400 nits panel.

Right, but in the kernel, where do you get these nits values from?

hdr_output_metadata blob is infoframe data to the monitor. I think this
should be independent of the metadata used for color transformations in
the display pipeline before the monitor.

EDID may tell us the monitor HDR metadata, but again what is used in
the color transformations should be independent, because EDIDs lie,
lighting environments change, and users have different preferences.

What about black levels?

Do you want to do black level adjustment?

How exactly should the compression work?

Where do you map the mid-tones?

What if the end user wants something different?

I suspect that this is not about tone mapping at all. The use cases
listed always have the display in PQ mode and just assume that no
content exceeds the PQ limitations. Then you can simply bring all
content to the color space with a matrix multiplication and then map the
linear light content somewhere into the PQ range. Tone mapping is
performed in the display only.

The use cases do use the word "desktop" though. Harry, could you expand
on this, are you seeking a design that is good for generic desktop
compositors too, or one that is more tailored to "embedded" video
player systems taking the most advantage of (potentially
fixed-function) hardware?

The goal is to enable this on a generic desktop, such as generic Wayland
implementations or ChromeOS. We're not looking for a custom solution for
some embedded systems, though the solution we end up with should obviously
not prevent an implementation on embedded video players.

What matrix would one choose? Which render intent would it
correspond to?

If you need to adapt different dynamic ranges into the blending dynamic
range, would a simple linear transformation really be enough?

From a generic wayland compositor point of view this is uninteresting.

It a compositor's decision to provide or not the metadata property to
the kernel. The metadata can be available from one or multiple clients
or most likely not available at all.

Compositors may put a display in HDR10 ( when PQ 2084 INV EOTF and TM
occurs in display ) or NATIVE mode and do not attach any metadata to the
connector and do TM in compositor.

It is all about user preference or compositor design, or a combination
of both options.

Indeed. The thing here is that you cannot just add KMS UAPI, you also
need to have the FOSS userspace to go with it. So you need to have your
audience defined, userspace patches written and reviewed and agreed
to be a good idea. I'm afraid this particular UAPI design would be
difficult to justify with 

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-18 Thread Harry Wentland


On 2021-05-18 3:56 a.m., Pekka Paalanen wrote:
> On Mon, 17 May 2021 15:39:03 -0400
> Vitaly Prosyak  wrote:
> 
>> On 2021-05-17 12:48 p.m., Sebastian Wick wrote:
>>> On 2021-05-17 10:57, Pekka Paalanen wrote:  
 On Fri, 14 May 2021 17:05:11 -0400
 Harry Wentland  wrote:
  
> On 2021-04-27 10:50 a.m., Pekka Paalanen wrote:  
>> On Mon, 26 Apr 2021 13:38:49 -0400
>> Harry Wentland  wrote:  

 ...
  
>>> ## Mastering Luminances
>>>
>>> Now we are able to use the PQ 2084 EOTF to define the luminance of
>>> pixels in absolute terms. Unfortunately we're again presented with
>>> physical limitations of the display technologies on the market   
> today.  
>>> Here are a few examples of luminance ranges of displays.
>>>
>>> | Display  | Luminance range in nits |
>>> |  | --- |
>>> | Typical PC display   | 0.3 - 200 |
>>> | Excellent LCD HDTV   | 0.3 - 400 |
>>> | HDR LCD w/ local dimming | 0.05 - 1,500 |
>>>
>>> Since no display can currently show the full 0.0005 to 10,000 nits
>>> luminance range the display will need to tonemap the HDR content,   
> i.e  
>>> to fit the content within a display's capabilities. To assist with
>>> tonemapping HDR content is usually accompanied with a metadata that
>>> describes (among other things) the minimum and maximum mastering
>>> luminance, i.e. the maximum and minimum luminance of the display   
> that  
>>> was used to master the HDR content.
>>>
>>> The HDR metadata is currently defined on the drm_connector via the
>>> hdr_output_metadata blob property.
>>>
>>> It might be useful to define per-plane hdr metadata, as different
>>> planes might have been mastered differently.  
>>
>> I don't think this would directly help with the dynamic range   
> blending  
>> problem. You still need to establish the mapping between the optical
>> values from two different EOTFs and dynamic ranges. Or can you know
>> which optical values match the mastering display maximum and minimum
>> luminances for not-PQ?
>>  
>
> My understanding of this is probably best illustrated by this example:
>
> Assume HDR was mastered on a display with a maximum white level of 500
> nits and played back on a display that supports a max white level of 
> 400
> nits. If you know the mastering white level of 500 you know that 
> this is
> the maximum value you need to compress down to 400 nits, allowing 
> you to
> use the full extent of the 400 nits panel.  

 Right, but in the kernel, where do you get these nits values from?

 hdr_output_metadata blob is infoframe data to the monitor. I think this
 should be independent of the metadata used for color transformations in
 the display pipeline before the monitor.

 EDID may tell us the monitor HDR metadata, but again what is used in
 the color transformations should be independent, because EDIDs lie,
 lighting environments change, and users have different preferences.

 What about black levels?

 Do you want to do black level adjustment?

 How exactly should the compression work?

 Where do you map the mid-tones?

 What if the end user wants something different?  
>>>
>>> I suspect that this is not about tone mapping at all. The use cases
>>> listed always have the display in PQ mode and just assume that no
>>> content exceeds the PQ limitations. Then you can simply bring all
>>> content to the color space with a matrix multiplication and then map the
>>> linear light content somewhere into the PQ range. Tone mapping is
>>> performed in the display only.
> 
> The use cases do use the word "desktop" though. Harry, could you expand
> on this, are you seeking a design that is good for generic desktop
> compositors too, or one that is more tailored to "embedded" video
> player systems taking the most advantage of (potentially
> fixed-function) hardware?
> 

The goal is to enable this on a generic desktop, such as generic Wayland
implementations or ChromeOS. We're not looking for a custom solution for
some embedded systems, though the solution we end up with should obviously
not prevent an implementation on embedded video players.

> What matrix would one choose? Which render intent would it
> correspond to?
> 
> If you need to adapt different dynamic ranges into the blending dynamic
> range, would a simple linear transformation really be enough?
> 
>>> From a generic wayland compositor point of view this is uninteresting.
>>>  
>> It a compositor's decision to provide or not the metadata property to 
>> the kernel. The metadata can be available from one or multiple clients 
>> or most likely not available at all.
>>
>> Compositors may put a display in HDR10 ( when PQ 2084 INV EOTF and TM 
>> 

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-18 Thread Pekka Paalanen
On Mon, 17 May 2021 15:39:03 -0400
Vitaly Prosyak  wrote:

> On 2021-05-17 12:48 p.m., Sebastian Wick wrote:
> > On 2021-05-17 10:57, Pekka Paalanen wrote:  
> >> On Fri, 14 May 2021 17:05:11 -0400
> >> Harry Wentland  wrote:
> >>  
> >>> On 2021-04-27 10:50 a.m., Pekka Paalanen wrote:  
> >>> > On Mon, 26 Apr 2021 13:38:49 -0400
> >>> > Harry Wentland  wrote:  
> >>
> >> ...
> >>  
> >>> >> ## Mastering Luminances
> >>> >>
> >>> >> Now we are able to use the PQ 2084 EOTF to define the luminance of
> >>> >> pixels in absolute terms. Unfortunately we're again presented with
> >>> >> physical limitations of the display technologies on the market   
> >>> today.  
> >>> >> Here are a few examples of luminance ranges of displays.
> >>> >>
> >>> >> | Display  | Luminance range in nits |
> >>> >> |  | --- |
> >>> >> | Typical PC display   | 0.3 - 200 |
> >>> >> | Excellent LCD HDTV   | 0.3 - 400 |
> >>> >> | HDR LCD w/ local dimming | 0.05 - 1,500 |
> >>> >>
> >>> >> Since no display can currently show the full 0.0005 to 10,000 nits
> >>> >> luminance range the display will need to tonemap the HDR content,   
> >>> i.e  
> >>> >> to fit the content within a display's capabilities. To assist with
> >>> >> tonemapping HDR content is usually accompanied with a metadata that
> >>> >> describes (among other things) the minimum and maximum mastering
> >>> >> luminance, i.e. the maximum and minimum luminance of the display   
> >>> that  
> >>> >> was used to master the HDR content.
> >>> >>
> >>> >> The HDR metadata is currently defined on the drm_connector via the
> >>> >> hdr_output_metadata blob property.
> >>> >>
> >>> >> It might be useful to define per-plane hdr metadata, as different
> >>> >> planes might have been mastered differently.  
> >>> >
> >>> > I don't think this would directly help with the dynamic range   
> >>> blending  
> >>> > problem. You still need to establish the mapping between the optical
> >>> > values from two different EOTFs and dynamic ranges. Or can you know
> >>> > which optical values match the mastering display maximum and minimum
> >>> > luminances for not-PQ?
> >>> >  
> >>>
> >>> My understanding of this is probably best illustrated by this example:
> >>>
> >>> Assume HDR was mastered on a display with a maximum white level of 500
> >>> nits and played back on a display that supports a max white level of 
> >>> 400
> >>> nits. If you know the mastering white level of 500 you know that 
> >>> this is
> >>> the maximum value you need to compress down to 400 nits, allowing 
> >>> you to
> >>> use the full extent of the 400 nits panel.  
> >>
> >> Right, but in the kernel, where do you get these nits values from?
> >>
> >> hdr_output_metadata blob is infoframe data to the monitor. I think this
> >> should be independent of the metadata used for color transformations in
> >> the display pipeline before the monitor.
> >>
> >> EDID may tell us the monitor HDR metadata, but again what is used in
> >> the color transformations should be independent, because EDIDs lie,
> >> lighting environments change, and users have different preferences.
> >>
> >> What about black levels?
> >>
> >> Do you want to do black level adjustment?
> >>
> >> How exactly should the compression work?
> >>
> >> Where do you map the mid-tones?
> >>
> >> What if the end user wants something different?  
> >
> > I suspect that this is not about tone mapping at all. The use cases
> > listed always have the display in PQ mode and just assume that no
> > content exceeds the PQ limitations. Then you can simply bring all
> > content to the color space with a matrix multiplication and then map the
> > linear light content somewhere into the PQ range. Tone mapping is
> > performed in the display only.

The use cases do use the word "desktop" though. Harry, could you expand
on this, are you seeking a design that is good for generic desktop
compositors too, or one that is more tailored to "embedded" video
player systems taking the most advantage of (potentially
fixed-function) hardware?

What matrix would one choose? Which render intent would it
correspond to?

If you need to adapt different dynamic ranges into the blending dynamic
range, would a simple linear transformation really be enough?

> > From a generic wayland compositor point of view this is uninteresting.
> >  
> It a compositor's decision to provide or not the metadata property to 
> the kernel. The metadata can be available from one or multiple clients 
> or most likely not available at all.
> 
> Compositors may put a display in HDR10 ( when PQ 2084 INV EOTF and TM 
> occurs in display ) or NATIVE mode and do not attach any metadata to the 
> connector and do TM in compositor.
> 
> It is all about user preference or compositor design, or a combination 
> of both options.

Indeed. The thing here is that you cannot just add KMS UAPI, you also
need to have the FOSS userspace to go with it. So you 

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-17 Thread Vitaly Prosyak


On 2021-05-17 12:48 p.m., Sebastian Wick wrote:

On 2021-05-17 10:57, Pekka Paalanen wrote:

On Fri, 14 May 2021 17:05:11 -0400
Harry Wentland  wrote:


On 2021-04-27 10:50 a.m., Pekka Paalanen wrote:
> On Mon, 26 Apr 2021 13:38:49 -0400
> Harry Wentland  wrote:


...


>> ## Mastering Luminances
>>
>> Now we are able to use the PQ 2084 EOTF to define the luminance of
>> pixels in absolute terms. Unfortunately we're again presented with
>> physical limitations of the display technologies on the market today.
>>
>> Here are a few examples of luminance ranges of displays.
>>
>> | Display  | Luminance range in nits |
>> |  | --- |
>> | Typical PC display   | 0.3 - 200 |
>> | Excellent LCD HDTV   | 0.3 - 400 |
>> | HDR LCD w/ local dimming | 0.05 - 1,500 |
>>
>> Since no display can currently show the full 0.0005 to 10,000 nits
>> luminance range the display will need to tonemap the HDR content, i.e
>> to fit the content within a display's capabilities. To assist with
>> tonemapping HDR content is usually accompanied with a metadata that
>> describes (among other things) the minimum and maximum mastering
>> luminance, i.e. the maximum and minimum luminance of the display that
>> was used to master the HDR content.
>>
>> The HDR metadata is currently defined on the drm_connector via the
>> hdr_output_metadata blob property.
>>
>> It might be useful to define per-plane hdr metadata, as different
>> planes might have been mastered differently.
>
> I don't think this would directly help with the dynamic range blending
> problem. You still need to establish the mapping between the optical
> values from two different EOTFs and dynamic ranges. Or can you know
> which optical values match the mastering display maximum and minimum
> luminances for not-PQ?
>

My understanding of this is probably best illustrated by this example:

Assume HDR was mastered on a display with a maximum white level of 500
nits and played back on a display that supports a max white level of 400
nits. If you know the mastering white level of 500 you know that this is
the maximum value you need to compress down to 400 nits, allowing you to
use the full extent of the 400 nits panel.


Right, but in the kernel, where do you get these nits values from?

hdr_output_metadata blob is infoframe data to the monitor. I think this
should be independent of the metadata used for color transformations in
the display pipeline before the monitor.

EDID may tell us the monitor HDR metadata, but again what is used in
the color transformations should be independent, because EDIDs lie,
lighting environments change, and users have different preferences.

What about black levels?

Do you want to do black level adjustment?

How exactly should the compression work?

Where do you map the mid-tones?

What if the end user wants something different?


I suspect that this is not about tone mapping at all. The use cases
listed always have the display in PQ mode and just assume that no
content exceeds the PQ limitations. Then you can simply bring all
content to the color space with a matrix multiplication and then map the
linear light content somewhere into the PQ range. Tone mapping is
performed in the display only.

From a generic wayland compositor point of view this is uninteresting.

It is a compositor's decision whether or not to provide the metadata
property to the kernel. The metadata can be available from one or
multiple clients or most likely not available at all.

Compositors may put a display in HDR10 mode (where the PQ 2084 inverse
EOTF and tone mapping occur in the display) or in NATIVE mode, attach no
metadata to the connector, and do tone mapping in the compositor.

It is all about user preference or compositor design, or a combination
of both options.




I completely agree with what you said below though. I would even argue
that all generic KMS abstract pipeline elements must have a well defined
place in the pipeline and follow an exact specified formula.




If you do not know the mastering luminance is 500 nits you would
have to compress 10,000 nits down to 400 (assuming PQ), losing quite
a bit of the full 400 nits available as you'll need room to map the 500
to 10,000 nits range which in reality is completely unused. You might end
up with mapping 500 nits to 350 nits, instead of mapping it to 400.


The quality of the result depends on the compression (tone-mapping)
algorithm. I believe no-one will ever do a simple linear compression of
ranges.

Instead, you probably do something smooth in the black end, keep
mid-tones roughly as they are, and the again do a smooth saturation to
some "reasonable" level that goes well with the monitor in the current
lighting environment without introducing coloring artifacts, and just
clip the huge overshoot of the full PQ-range.

There are many big and small decisions to be made in how to map
out-of-gamut or out-of-brightness values into the displayable range,
and no 

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-17 Thread Sebastian Wick

On 2021-05-17 10:57, Pekka Paalanen wrote:

On Fri, 14 May 2021 17:05:11 -0400
Harry Wentland  wrote:


On 2021-04-27 10:50 a.m., Pekka Paalanen wrote:
> On Mon, 26 Apr 2021 13:38:49 -0400
> Harry Wentland  wrote:


...


>> ## Mastering Luminances
>>
>> Now we are able to use the PQ 2084 EOTF to define the luminance of
>> pixels in absolute terms. Unfortunately we're again presented with
>> physical limitations of the display technologies on the market today.
>> Here are a few examples of luminance ranges of displays.
>>
>> | Display  | Luminance range in nits |
>> |  | --- |
>> | Typical PC display   | 0.3 - 200   |
>> | Excellent LCD HDTV   | 0.3 - 400   |
>> | HDR LCD w/ local dimming | 0.05 - 1,500|
>>
>> Since no display can currently show the full 0.0005 to 10,000 nits
>> luminance range the display will need to tonemap the HDR content, i.e
>> to fit the content within a display's capabilities. To assist with
>> tonemapping HDR content is usually accompanied with a metadata that
>> describes (among other things) the minimum and maximum mastering
>> luminance, i.e. the maximum and minimum luminance of the display that
>> was used to master the HDR content.
>>
>> The HDR metadata is currently defined on the drm_connector via the
>> hdr_output_metadata blob property.
>>
>> It might be useful to define per-plane hdr metadata, as different
>> planes might have been mastered differently.
>
> I don't think this would directly help with the dynamic range blending
> problem. You still need to establish the mapping between the optical
> values from two different EOTFs and dynamic ranges. Or can you know
> which optical values match the mastering display maximum and minimum
> luminances for not-PQ?
>

My understanding of this is probably best illustrated by this example:

Assume HDR was mastered on a display with a maximum white level of 500
nits and played back on a display that supports a max white level of 400
nits. If you know the mastering white level of 500 you know that this is
the maximum value you need to compress down to 400 nits, allowing you to
use the full extent of the 400 nits panel.


Right, but in the kernel, where do you get these nits values from?

hdr_output_metadata blob is infoframe data to the monitor. I think this
should be independent of the metadata used for color transformations in
the display pipeline before the monitor.

EDID may tell us the monitor HDR metadata, but again what is used in
the color transformations should be independent, because EDIDs lie,
lighting environments change, and users have different preferences.

What about black levels?

Do you want to do black level adjustment?

How exactly should the compression work?

Where do you map the mid-tones?

What if the end user wants something different?


I suspect that this is not about tone mapping at all. The use cases
listed always have the display in PQ mode and just assume that no
content exceeds the PQ limitations. Then you can simply bring all
content to the color space with a matrix multiplication and then map the
linear light content somewhere into the PQ range. Tone mapping is
performed in the display only.

From a generic wayland compositor point of view this is uninteresting.

I completely agree with what you said below though. I would even argue
that all generic KMS abstract pipeline elements must have a well defined
place in the pipeline and follow an exact specified formula.




If you do not know the mastering luminance is 500 nits you would
have to compress 10,000 nits down to 400 (assuming PQ), losing quite
a bit of the full 400 nits available as you'll need room to map the 500
to 10,000 nits range which in reality is completely unused. You might end
up with mapping 500 nits to 350 nits, instead of mapping it to 400.


The quality of the result depends on the compression (tone-mapping)
algorithm. I believe no-one will ever do a simple linear compression of
ranges.

Instead, you probably do something smooth in the black end, keep
mid-tones roughly as they are, and the again do a smooth saturation to
some "reasonable" level that goes well with the monitor in the current
lighting environment without introducing coloring artifacts, and just
clip the huge overshoot of the full PQ-range.

There are many big and small decisions to be made in how to map
out-of-gamut or out-of-brightness values into the displayable range,
and no algorithm fits all use cases. I believe this is why e.g. ICC
has several different "render intents", some of which are so vaguely
defined that they are practically undefined - just like what "a good
picture" means. You have essentially three dimensions: luminance, hue,
and saturation. Which one will you sacrifice, shift or emphasize and to
what degree to fit the square content peg into the round display hole?

A naive example: Let's say content has 300 nits red. Your display can

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-17 Thread Pekka Paalanen
On Fri, 14 May 2021 17:05:11 -0400
Harry Wentland  wrote:

> On 2021-04-27 10:50 a.m., Pekka Paalanen wrote:
> > On Mon, 26 Apr 2021 13:38:49 -0400
> > Harry Wentland  wrote:

...

> >> ## Mastering Luminances
> >>
> >> Now we are able to use the PQ 2084 EOTF to define the luminance of
> >> pixels in absolute terms. Unfortunately we're again presented with
> >> physical limitations of the display technologies on the market today.
> >> Here are a few examples of luminance ranges of displays.
> >>
> >> | Display  | Luminance range in nits |
> >> |  | --- |
> >> | Typical PC display   | 0.3 - 200   |
> >> | Excellent LCD HDTV   | 0.3 - 400   |
> >> | HDR LCD w/ local dimming | 0.05 - 1,500|
> >>
> >> Since no display can currently show the full 0.0005 to 10,000 nits
> >> luminance range the display will need to tonemap the HDR content, i.e
> >> to fit the content within a display's capabilities. To assist with
> >> tonemapping HDR content is usually accompanied with a metadata that
> >> describes (among other things) the minimum and maximum mastering
> >> luminance, i.e. the maximum and minimum luminance of the display that
> >> was used to master the HDR content.
> >>
> >> The HDR metadata is currently defined on the drm_connector via the
> >> hdr_output_metadata blob property.
> >>
> >> It might be useful to define per-plane hdr metadata, as different
> >> planes might have been mastered differently.  
> > 
> > I don't think this would directly help with the dynamic range blending
> > problem. You still need to establish the mapping between the optical
> > values from two different EOTFs and dynamic ranges. Or can you know
> > which optical values match the mastering display maximum and minimum
> > luminances for not-PQ?
> >   
> 
> My understanding of this is probably best illustrated by this example:
> 
> Assume HDR was mastered on a display with a maximum white level of 500
> nits and played back on a display that supports a max white level of 400
> nits. If you know the mastering white level of 500 you know that this is
> the maximum value you need to compress down to 400 nits, allowing you to
> use the full extent of the 400 nits panel.

Right, but in the kernel, where do you get these nits values from?

hdr_output_metadata blob is infoframe data to the monitor. I think this
should be independent of the metadata used for color transformations in
the display pipeline before the monitor.

EDID may tell us the monitor HDR metadata, but again what is used in
the color transformations should be independent, because EDIDs lie,
lighting environments change, and users have different preferences.

What about black levels?

Do you want to do black level adjustment?

How exactly should the compression work?

Where do you map the mid-tones?

What if the end user wants something different?

> If you do not know the mastering luminance is 500 nits you would
> have to compress 10,000 nits down to 400 (assuming PQ), losing quite
> a bit of the full 400 nits available as you'll need room to map the 500
> to 10,000 nits range which in reality is completely unused. You might end
> up with mapping 500 nits to 350 nits, instead of mapping it to 400.

The quality of the result depends on the compression (tone-mapping)
algorithm. I believe no-one will ever do a simple linear compression of
ranges.

Instead, you probably do something smooth in the black end, keep
mid-tones roughly as they are, and then again do a smooth saturation to
some "reasonable" level that goes well with the monitor in the current
lighting environment without introducing coloring artifacts, and just
clip the huge overshoot of the full PQ range.
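To make that hand-waving slightly more concrete, here is one naive,
purely illustrative curve of that shape (my sketch, values in nits; the
knee point and roll-off are arbitrary choices, not a recommendation):

#include <math.h>
#include <stdio.h>

/* Keep everything below a knee untouched, roll off the rest smoothly,
 * and clip whatever still overshoots the display maximum. */
static double tonemap_nits(double nits, double src_max, double dst_max)
{
        double knee = 0.75 * dst_max;   /* arbitrary knee point */
        double t;

        if (nits <= knee || src_max <= dst_max)
                return fmin(nits, dst_max);

        t = (nits - knee) / (src_max - knee);
        return fmin(knee + (dst_max - knee) * 2.0 * t / (1.0 + t), dst_max);
}

int main(void)
{
        /* e.g. 1,000 nit mastered content shown on a 400 nit panel. */
        printf("%.1f nits\n", tonemap_nits(1000.0, 1000.0, 400.0));
        return 0;
}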

There are many big and small decisions to be made in how to map
out-of-gamut or out-of-brightness values into the displayable range,
and no algorithm fits all use cases. I believe this is why e.g. ICC
has several different "render intents", some of which are so vaguely
defined that they are practically undefined - just like what "a good
picture" means. You have essentially three dimensions: luminance, hue,
and saturation. Which one will you sacrifice, shift or emphasize and to
what degree to fit the square content peg into the round display hole?

A naive example: Let's say content has 300 nits red. Your display can
show max 400 nits white or max 180 nits red, and anything in between.
What will you show?

The problem is that if the UAPI does not define exactly what happens, then
taking advantage of these hardware capabilities means you have no idea
what happens to your content. This may be fine for closed systems, where
the software has been carefully built for the specific hardware
revision and the use cases of the complete system have been
pre-defined, so that it can assume what should and will happen. But
should that be exposed as a generic UAPI, when generic userspace has no
chance of knowing what it will do?

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-14 Thread Harry Wentland
On 2021-04-27 10:50 a.m., Pekka Paalanen wrote:
> On Mon, 26 Apr 2021 13:38:49 -0400
> Harry Wentland  wrote:
> 
>> ## Introduction
>>
>> We are looking to enable HDR support for a couple of single-plane and
>> multi-plane scenarios. To do this effectively we recommend new
>> interfaces to drm_plane. Below I'll give a bit of background on HDR
>> and why we propose these interfaces.
>>
>>
>> ## Defining a pixel's luminance
>>
>> Currently the luminance space of pixels in a framebuffer/plane
>> presented to the display is not well defined. It's usually assumed to
>> be in a 2.2 or 2.4 gamma space and has no mapping to an absolute
>> luminance value but is interpreted in relative terms.
>>
>> Luminance can be measured and described in absolute terms as candela
>> per meter squared, or cd/m2, or nits. Even though a pixel value can
>> be mapped to luminance in a linear fashion to do so without losing a
>> lot of detail requires 16-bpc color depth. The reason for this is
>> that human perception can distinguish roughly between a 0.5-1%
>> luminance delta. A linear representation is suboptimal, wasting
>> precision in the highlights and losing precision in the shadows.
>>
>> A gamma curve is a decent approximation to a human's perception of
>> luminance, but the PQ (perceptual quantizer) function [1] improves on
>> it. It also defines the luminance values in absolute terms, with the
>> highest value being 10,000 nits and the lowest 0.0005 nits.
>>
>> Using a content that's defined in PQ space we can approximate the
>> real world in a much better way.
>>
>> Here are some examples of real-life objects and their approximate
>> luminance values:
>>
>> | Object| Luminance in nits |
>> | - | - |
>> | Sun   | 1.6 million   |
>> | Fluorescent light | 10,000|
>> | Highlights| 1,000 - sunlight  |
>> | White Objects | 250 - 1,000   |
>> | Typical objects   | 1 - 250   |
>> | Shadows   | 0.01 - 1  |
>> | Ultra Blacks  | 0 - 0.0005|
>>
>>
>> ## Describing the luminance space
>>
>> **We propose a new drm_plane property to describe the Electro-Optical
>> Transfer Function (EOTF) with which its framebuffer was composed.**
>> Examples of EOTF are:
>>
>> | EOTF      | Description                                                                |
>> | --------- |:-------------------------------------------------------------------------- |
>> | Gamma 2.2 | a simple 2.2 gamma                                                         |
>> | sRGB      | 2.4 gamma with small initial linear section                                |
>> | PQ 2084   | SMPTE ST 2084; used for HDR video and allows for up to 10,000 nit support  |
>> | Linear    | Linear relationship between pixel value and luminance value                |
>>
> 
> The definitions agree with what I have learnt so far. However, with
> these EOTF definitions, only PQ defines absolute luminance values
> while the others do not. So this is not enough information to blend
> planes together if they do not all use the same EOTF with the same
> dynamic range. More below.
> 

Good point.

> 
>>
>> ## Mastering Luminances
>>
>> Now we are able to use the PQ 2084 EOTF to define the luminance of
>> pixels in absolute terms. Unfortunately we're again presented with
>> physical limitations of the display technologies on the market today.
>> Here are a few examples of luminance ranges of displays.
>>
>> | Display  | Luminance range in nits |
>> |  | --- |
>> | Typical PC display   | 0.3 - 200   |
>> | Excellent LCD HDTV   | 0.3 - 400   |
>> | HDR LCD w/ local dimming | 0.05 - 1,500|
>>
>> Since no display can currently show the full 0.0005 to 10,000 nits
>> luminance range the display will need to tonemap the HDR content, i.e
>> to fit the content within a display's capabilities. To assist with
>> tonemapping HDR content is usually accompanied with a metadata that
>> describes (among other things) the minimum and maximum mastering
>> luminance, i.e. the maximum and minimum luminance of the display that
>> was used to master the HDR content.
>>
>> The HDR metadata is currently defined on the drm_connector via the
>> hdr_output_metadata blob property.
>>
>> It might be useful to define per-plane hdr metadata, as different
>> planes might have been mastered differently.
> 
> I don't think this would directly help with the dynamic range blending
> problem. You still need to establish the mapping between the optical
> values from two different EOTFs and dynamic ranges. Or can you know
> which optical values match the mastering display maximum and minimum
> luminances for not-PQ?
> 

My understanding of this is probably best illustrated by this example:

Assume HDR was mastered on a display with a maximum white level of 500
nits and played back on a display that 

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-05-14 Thread Harry Wentland


On 2021-04-30 6:39 a.m., Shashank Sharma wrote:
> Hello Pekka,
> 
> On 30/04/21 15:13, Pekka Paalanen wrote:
>> On Wed, 28 Apr 2021 13:24:27 +0530
>> Shashank Sharma  wrote:
>>
>>> Assuming these details, A compositor will look for DRM color properties 
>>> like these:
>>>
>>> 1. Degamma plane property : To make buffers linear for Gamut mapping
>>>
>>> 2. Gamut mapping plane property:  To gamut map SRGB buffer to BT2020 
>>> colorspace
>>>
>>> 3. Color space conversion plane property: To convert from YCBCR->RGB
>>>
>>> 4. Tone mapping plane property: To tone map SDR buffer S2H and HDR buffer 
>>> H2H
>>>
>>> 5. Gamma plane/CRTC property: to re-apply the output ST2084 curve
>>>
>>>
>> ...
>>
>>>  *
>>>  *
>>>  *
>>>  * HDR 600 nits, BT2020, YCbCr 4:2:0, non-linear
>>>  *   -> Degamma (OETF ST 2084)              -> HDR 600 nits, BT2020, YCbCr 4:2:0, linear
>>>  *   -> Color space conversion (YCbCr->RGB) -> HDR 600 nits, BT2020, RGB888, linear
>>>  *   -> Tone mapping (H2H, 600->500)        -> HDR 500 nits, BT2020, RGB888, linear
>>>  *   -> Gamma (ST 2084)                     -> HDR 500 nits, BT2020, RGB888, ST 2084
>>>  */
>> Hi Shashank,
>>
>> I think you might have degamma and color model conversion reversed, or
>> is that a new thing in the HDR specs?
>>
>> Usually the YCbCr/RGB conversion matrix applies to non-linear values
>> AFAIU.
> Ah, that was due to the Gamut mapping block. You are right, color format 
> conversion can happen on non-linear data (doesn't mean it can't happen on 
> linear), but in the sequential block above, there was gamut mapping (color 
> space conversion), which needs to be done on Linear space, and I was a bit 
> too lazy to create separate blocks, so I just re[placed the block titles  :D.
>> There is also confusion with OETF vs. EOTF. I got that initially wrong
>> too. OETF is not just a name for inverse-EOTF but it is used in a
>> different context. Though here it seems to be just a typo.
>> OETF is inherent to a camera when it converts light into
>> electrical signals. EOTF is inherent to a monitor when it converts
>> electrical signals to light. Depending on what the electrical signals
>> have been defined to be in each step of a broadcasting chain, you might
>> need OETF or EOTF or their inverse or a different OETF or EOTF or their
>> inverse.
> 
> Yes, that was a typo. The intention was to call it inverse curve for HDR 
> encoded buffers. It's almost 4 years (and 2 companies) since I last did HDR, 
> so I am a bit rusty on the topic ;) .
> 
> - Shashank
> 

Thanks, Ville and Shashank. This is indeed helpful. I apologize for the late
response but I needed to take some time to do more reading and internalize some
of the HDR and CM concepts. I will send out a v2 of my patchset but realize
that it is only a small step toward the right KMS interface for HDR and CM.

Harry

>>
>> As we are talking about displays and likely assuming display-referred
>> content (not scene-referred content), we probably have no use for OETF,
>> but we could have several different EOTFs.
>>
>>
>> Thanks,
>> pq



Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-04-30 Thread Shashank Sharma
Hello Pekka,

On 30/04/21 15:13, Pekka Paalanen wrote:
> On Wed, 28 Apr 2021 13:24:27 +0530
> Shashank Sharma  wrote:
>
>> Assuming these details, A compositor will look for DRM color properties like 
>> these:
>>
>> 1. Degamma plane property : To make buffers linear for Gamut mapping
>>
>> 2. Gamut mapping plane property:  To gamut map SRGB buffer to BT2020 
>> colorspace
>>
>> 3. Color space conversion plane property: To convert from YCBCR->RGB
>>
>> 4. Tone mapping plane property: To tone map SDR buffer S2H and HDR buffer H2H
>>
>> 5. Gamma plane/CRTC property: to re-apply the output ST2084 curve
>>
>>
> ...
>
>>  *
>>  *
>>  *
>>  * HDR 600 nits, BT2020, YCbCr 4:2:0, non-linear
>>  *   -> Degamma (OETF ST 2084)              -> HDR 600 nits, BT2020, YCbCr 4:2:0, linear
>>  *   -> Color space conversion (YCbCr->RGB) -> HDR 600 nits, BT2020, RGB888, linear
>>  *   -> Tone mapping (H2H, 600->500)        -> HDR 500 nits, BT2020, RGB888, linear
>>  *   -> Gamma (ST 2084)                     -> HDR 500 nits, BT2020, RGB888, ST 2084
>>  */
> Hi Shashank,
>
> I think you might have degamma and color model conversion reversed, or
> is that a new thing in the HDR specs?
>
> Usually the YCbCr/RGB conversion matrix applies to non-linear values
> AFAIU.
Ah, that was due to the gamut mapping block. You are right, color format
conversion can happen on non-linear data (which doesn't mean it can't happen
on linear data), but in the sequential block above there was gamut mapping
(color space conversion), which needs to be done in linear space, and I was a
bit too lazy to create separate blocks, so I just replaced the block titles :D.
> There is also confusion with OETF vs. EOTF. I got that initially wrong
> too. OETF is not just a name for inverse-EOTF but it is used in a
> different context. Though here it seems to be just a typo.
> OETF is inherent to a camera when it converts light into
> electrical signals. EOTF is inherent to a monitor when it converts
> electrical signals to light. Depending on what the electrical signals
> have been defined to be in each step of a broadcasting chain, you might
> need OETF or EOTF or their inverse or a different OETF or EOTF or their
> inverse.

Yes, that was a typo. The intention was to call it inverse curve for HDR 
encoded buffers. It's almost 4 years (and 2 companies) since I last did HDR, so 
I am a bit rusty on the topic ;) .

- Shashank

>
> As we are talking about displays and likely assuming display-referred
> content (not scene-referred content), we probably have no use for OETF,
> but we could have several different EOTFs.
>
>
> Thanks,
> pq


Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-04-30 Thread Pekka Paalanen
On Wed, 28 Apr 2021 13:24:27 +0530
Shashank Sharma  wrote:

> Assuming these details, A compositor will look for DRM color properties like 
> these:
> 
> 1. Degamma plane property : To make buffers linear for Gamut mapping
> 
> 2. Gamut mapping plane property:  To gamut map SRGB buffer to BT2020 
> colorspace
> 
> 3. Color space conversion plane property: To convert from YCBCR->RGB
> 
> 4. Tone mapping plane property: To tone map SDR buffer S2H and HDR buffer H2H
> 
> 5. Gamma plane/CRTC property: to re-apply the output ST2084 curve
> 
> 

...

>  *
>  *
>  *
>  * HDR 600 nits, BT2020, YCbCr 4:2:0, non-linear
>  *   -> Degamma (OETF ST 2084)              -> HDR 600 nits, BT2020, YCbCr 4:2:0, linear
>  *   -> Color space conversion (YCbCr->RGB) -> HDR 600 nits, BT2020, RGB888, linear
>  *   -> Tone mapping (H2H, 600->500)        -> HDR 500 nits, BT2020, RGB888, linear
>  *   -> Gamma (ST 2084)                     -> HDR 500 nits, BT2020, RGB888, ST 2084
>  */

Hi Shashank,

I think you might have degamma and color model conversion reversed, or
is that a new thing in the HDR specs?

Usually the YCbCr/RGB conversion matrix applies to non-linear values
AFAIU.

There is also confusion with OETF vs. EOTF. I got that initially wrong
too. OETF is not just a name for inverse-EOTF but it is used in a
different context. Though here it seems to be just a typo.

OETF is inherent to a camera when it converts light into
electrical signals. EOTF is inherent to a monitor when it converts
electrical signals to light. Depending on what the electrical signals
have been defined to be in each step of a broadcasting chain, you might
need OETF or EOTF or their inverse or a different OETF or EOTF or their
inverse.

As we are talking about displays and likely assuming display-referred
content (not scene-referred content), we probably have no use for OETF,
but we could have several different EOTFs.


Thanks,
pq




Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-04-28 Thread Shashank Sharma
Hello Harry,

Many of us in the mail chain have discussed this before: what is the right
way to blend and tone map an SDR and an HDR buffer from the same or different
color spaces, and what kind of DRM plane properties will be needed.

As you can see from the previous comments, the majority of the decision
making will happen in the compositor, as it is the only software component
that has the overall picture.

Reference: 
(https://lists.freedesktop.org/archives/wayland-devel/2019-January/039808.html )

A systematic approach to making such a blending policy would look like this:


- Compositor needs to understand the following values of each of the buffer:

    - Color space or Gamut: BT2020/SRGB/DCI-P3/BT709/BT601 etc

    - Color format (RGB/YCBCR) and subsampling (444/422/420)

    - Tone (SDR/HDR_A/HDR_B)


- Then the Compositor needs to understand the capabilities of the output 
display, as this will be a clamping value

    - Output Gamut support (BT2020/SRGB/DCIP3)

    - Output max Luminance of the monitor in Nits (even in case of HDR content 
to HDR display)

  

Based on all this information, the compositor needs to set a blending
target, which contains the following:

    - Output Colorspace of the blended output: say BT2020

    - Output Luminance of the blended output: Match content, if monitor can 
support it

    - Output Color format of the blended output: Say YCBCR4:2:0


Let's assume compositor prepares a blending policy with output as:

    - Output Luminance: HDR 500 Nits

    - Output color space: BT2020

    - Output color format: RGB888

    - Output curve: ST2084
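For illustration only, such a blending target could be captured in a small
compositor-side structure like the following; the names and enum values are
made up for this sketch and do not exist in any UAPI or library today:

/* Hypothetical compositor-side description of the blending target above. */
enum blend_color_space { CS_BT709, CS_DCI_P3, CS_BT2020 };
enum blend_transfer_fn { TF_GAMMA22, TF_SRGB, TF_PQ_ST2084, TF_LINEAR };

struct blend_target {
        enum blend_color_space cs;   /* e.g. CS_BT2020 */
        enum blend_transfer_fn tf;   /* e.g. TF_PQ_ST2084 for ST 2084 output */
        unsigned int max_nits;       /* e.g. 500 */
        unsigned int drm_format;     /* e.g. DRM_FORMAT_RGB888 */
};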

  

Assuming these details, a compositor will look for DRM color properties like
these:

1. Degamma plane property : To make buffers linear for Gamut mapping

2. Gamut mapping plane property:  To gamut map SRGB buffer to BT2020 colorspace

3. Color space conversion plane property: To convert from YCBCR->RGB

4. Tone mapping plane property: To tone map SDR buffer S2H and HDR buffer H2H

5. Gamma plane/CRTC property: to re-apply the output ST2084 curve


We will also need connector/CRTC properties to set AVI info-frames accordingly.

A high level block diagram for blending on a generic HW should look like this:

/*
 *  SDR path:
 *    SDR 200 nits, BT709, RGB888, non-linear
 *      -> Degamma (2.2)                       -> SDR 200 nits, BT709, RGB888, linear
 *      -> Gamut mapping (709->2020)           -> SDR 200 nits, BT2020, RGB888, linear
 *      -> Tone mapping (S2H, 200->500)        -> HDR 500 nits, BT2020, RGB888, linear
 *      -> Gamma (ST 2084)                     -> HDR 500 nits, BT2020, RGB888, ST 2084
 *
 *  HDR path:
 *    HDR 600 nits, BT2020, YCbCr 4:2:0, non-linear
 *      -> Degamma (OETF ST 2084)              -> HDR 600 nits, BT2020, YCbCr 4:2:0, linear
 *      -> Color space conversion (YCbCr->RGB) -> HDR 600 nits, BT2020, RGB888, linear
 *      -> Tone mapping (H2H, 600->500)        -> HDR 500 nits, BT2020, RGB888, linear
 *      -> Gamma (ST 2084)                     -> HDR 500 nits, BT2020, RGB888, ST 2084
 */
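To make the property list above a bit more concrete, a compositor could
program such per-plane properties through the usual atomic API. This is a
minimal, purely hypothetical sketch: none of these plane properties exist in
mainline today, and the property IDs would have to be discovered via
drmModeObjectGetProperties()/drmModeGetProperty() as usual.

#include <stdint.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

/* Error handling kept minimal for brevity. */
static int set_hdr_plane_props(int fd, uint32_t plane_id,
                               uint32_t degamma_prop, uint64_t degamma_val,
                               uint32_t gamut_prop, uint64_t gamut_val,
                               uint32_t tonemap_prop, uint64_t tonemap_val)
{
        drmModeAtomicReqPtr req = drmModeAtomicAlloc();
        int ret;

        if (!req)
                return -1;

        drmModeAtomicAddProperty(req, plane_id, degamma_prop, degamma_val);
        drmModeAtomicAddProperty(req, plane_id, gamut_prop, gamut_val);
        drmModeAtomicAddProperty(req, plane_id, tonemap_prop, tonemap_val);

        /* Test first; a driver that cannot satisfy the pipeline must
         * reject the configuration here. */
        ret = drmModeAtomicCommit(fd, req, DRM_MODE_ATOMIC_TEST_ONLY, NULL);
        drmModeAtomicFree(req);
        return ret;
}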


Hope this helps to refine the series.


Regards

Shashank

On 27/04/21 20:20, Pekka Paalanen wrote:
> On Mon, 26 Apr 2021 13:38:49 -0400
> Harry Wentland  wrote:
>
>> ## Introduction
>>
>> We are looking to enable HDR support for a couple of single-plane and
>> multi-plane scenarios. To do this effectively we recommend new
>> interfaces to drm_plane. Below I'll give a bit of background on HDR
>> and why we propose these interfaces.
>>
>>
>> ## Defining a pixel's luminance
>>
>> Currently the luminance space of pixels in a framebuffer/plane
>> presented to the display is not well defined. It's usually assumed to
>> be in a 2.2 or 2.4 gamma space and has no mapping to an absolute
>> luminance value but is interpreted in relative terms.
>>
>> Luminance can be measured and described in absolute terms as candela
>> per meter squared, or cd/m2, or nits. Even though a pixel value can
>> be mapped to luminance in a linear fashion to do so without losing a
>> lot of detail requires 16-bpc color depth. The reason for this is
>> that human perception can distinguish roughly between a 0.5-1%
>> luminance delta. A linear representation is suboptimal, wasting
>> precision in the 

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-04-27 Thread Pekka Paalanen
On Mon, 26 Apr 2021 13:38:49 -0400
Harry Wentland  wrote:

> ## Introduction
> 
> We are looking to enable HDR support for a couple of single-plane and
> multi-plane scenarios. To do this effectively we recommend new
> interfaces to drm_plane. Below I'll give a bit of background on HDR
> and why we propose these interfaces.
> 
> 
> ## Defining a pixel's luminance
> 
> Currently the luminance space of pixels in a framebuffer/plane
> presented to the display is not well defined. It's usually assumed to
> be in a 2.2 or 2.4 gamma space and has no mapping to an absolute
> luminance value but is interpreted in relative terms.
> 
> Luminance can be measured and described in absolute terms as candela
> per meter squared, or cd/m2, or nits. Even though a pixel value can
> be mapped to luminance in a linear fashion to do so without losing a
> lot of detail requires 16-bpc color depth. The reason for this is
> that human perception can distinguish roughly between a 0.5-1%
> luminance delta. A linear representation is suboptimal, wasting
> precision in the highlights and losing precision in the shadows.
> 
> A gamma curve is a decent approximation to a human's perception of
> luminance, but the PQ (perceptual quantizer) function [1] improves on
> it. It also defines the luminance values in absolute terms, with the
> highest value being 10,000 nits and the lowest 0.0005 nits.
> 
> Using a content that's defined in PQ space we can approximate the
> real world in a much better way.
> 
> Here are some examples of real-life objects and their approximate
> luminance values:
> 
> | Object| Luminance in nits |
> | - | - |
> | Sun   | 1.6 million   |
> | Fluorescent light | 10,000|
> | Highlights| 1,000 - sunlight  |
> | White Objects | 250 - 1,000   |
> | Typical objects   | 1 - 250   |
> | Shadows   | 0.01 - 1  |
> | Ultra Blacks  | 0 - 0.0005|
> 
> 
> ## Describing the luminance space
> 
> **We propose a new drm_plane property to describe the Electro-Optical
> Transfer Function (EOTF) with which its framebuffer was composed.**
> Examples of EOTF are:
> 
> | EOTF      | Description                                                                |
> | --------- |:-------------------------------------------------------------------------- |
> | Gamma 2.2 | a simple 2.2 gamma                                                         |
> | sRGB      | 2.4 gamma with small initial linear section                                |
> | PQ 2084   | SMPTE ST 2084; used for HDR video and allows for up to 10,000 nit support  |
> | Linear    | Linear relationship between pixel value and luminance value                |
> 

The definitions agree with what I have learnt so far. However, with
these EOTF definitions, only PQ defines absolute luminance values
while the others do not. So this is not enough information to blend
planes together if they do not all use the same EOTF with the same
dynamic range. More below.
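For reference, a small sketch of the two transfer functions in question (my
own illustration, constants from SMPTE ST 2084 and IEC 61966-2-1): the PQ
EOTF yields absolute luminance in cd/m2, while the sRGB EOTF only yields a
relative [0, 1] value whose absolute meaning depends on the display and the
viewing environment.

#include <math.h>
#include <stdio.h>

static double pq_eotf_nits(double n)    /* n: non-linear signal, 0..1 */
{
        const double m1 = 2610.0 / 16384.0;
        const double m2 = 2523.0 / 4096.0 * 128.0;
        const double c1 = 3424.0 / 4096.0;
        const double c2 = 2413.0 / 4096.0 * 32.0;
        const double c3 = 2392.0 / 4096.0 * 32.0;
        double p = pow(n, 1.0 / m2);

        return 10000.0 * pow(fmax(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
}

static double srgb_eotf(double v)       /* v: non-linear signal, 0..1 */
{
        /* Relative: 1.0 means "display white", whatever that happens to be. */
        return v <= 0.04045 ? v / 12.92 : pow((v + 0.055) / 1.055, 2.4);
}

int main(void)
{
        printf("PQ 0.5 -> %.2f cd/m2, sRGB 0.5 -> %.4f (relative)\n",
               pq_eotf_nits(0.5), srgb_eotf(0.5));
        return 0;
}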


> 
> ## Mastering Luminances
> 
> Now we are able to use the PQ 2084 EOTF to define the luminance of
> pixels in absolute terms. Unfortunately we're again presented with
> physical limitations of the display technologies on the market today.
> Here are a few examples of luminance ranges of displays.
> 
> | Display  | Luminance range in nits |
> |  | --- |
> | Typical PC display   | 0.3 - 200   |
> | Excellent LCD HDTV   | 0.3 - 400   |
> | HDR LCD w/ local dimming | 0.05 - 1,500|
> 
> Since no display can currently show the full 0.0005 to 10,000 nits
> luminance range the display will need to tonemap the HDR content, i.e
> to fit the content within a display's capabilities. To assist with
> tonemapping HDR content is usually accompanied with a metadata that
> describes (among other things) the minimum and maximum mastering
> luminance, i.e. the maximum and minimum luminance of the display that
> was used to master the HDR content.
> 
> The HDR metadata is currently defined on the drm_connector via the
> hdr_output_metadata blob property.
> 
> It might be useful to define per-plane hdr metadata, as different
> planes might have been mastered differently.

I don't think this would directly help with the dynamic range blending
problem. You still need to establish the mapping between the optical
values from two different EOTFs and dynamic ranges. Or can you know
which optical values match the mastering display maximum and minimum
luminances for not-PQ?


> ## SDR Luminance
> 
> Since SDR covers a smaller luminance range than HDR, an SDR plane
> might look dark when blended with HDR content. Since the max HDR
> luminance can be quite variable (200-1,500 nits on actual displays)
> it is best to make the SDR maximum luminance value configurable.
> 
> **We propose a drm_plane property to specify the 

Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes

2021-04-27 Thread Daniel Vetter
On Mon, Apr 26, 2021 at 01:38:49PM -0400, Harry Wentland wrote:
> 
> ## Introduction
> 
> We are looking to enable HDR support for a couple of single-plane and
> multi-plane scenarios. To do this effectively we recommend new
> interfaces to drm_plane. Below I'll give a bit of background on HDR and
> why we propose these interfaces.

I think this is one of the topics that would tremendously benefit from the
uapi rfc process, with lots of compositor people involved.

https://dri.freedesktop.org/docs/drm/gpu/rfc/

Also for this I think we really do need a pretty solid understanding of
the involved compositor protocols, otherwise the kernel uapi is going to be
for naught.
-Daniel

> 
> 
> ## Defining a pixel's luminance
> 
> Currently the luminance space of pixels in a framebuffer/plane presented to 
> the display is not well defined. It's usually assumed to be in a 2.2 or 2.4 
> gamma space and has no mapping to an absolute luminance value but is 
> interpreted in relative terms.
> 
> Luminance can be measured and described in absolute terms as candela per 
> meter squared, or cd/m2, or nits. Even though a pixel value can be mapped to 
> luminance in a linear fashion to do so without losing a lot of detail 
> requires 16-bpc color depth. The reason for this is that human perception can 
> distinguish roughly between a 0.5-1% luminance delta. A linear representation 
> is suboptimal, wasting precision in the highlights and losing precision in 
> the shadows.
> 
> A gamma curve is a decent approximation to a human's perception of luminance, 
> but the PQ (perceptual quantizer) function [1] improves on it. It also 
> defines the luminance values in absolute terms, with the highest value being 
> 10,000 nits and the lowest 0.0005 nits.
> 
> Using a content that's defined in PQ space we can approximate the real world 
> in a much better way.
> 
> Here are some examples of real-life objects and their approximate luminance 
> values:
> 
> | Object| Luminance in nits |
> | - | - |
> | Sun   | 1.6 million   |
> | Fluorescent light | 10,000|
> | Highlights| 1,000 - sunlight  |
> | White Objects | 250 - 1,000   |
> | Typical objects   | 1 - 250   |
> | Shadows   | 0.01 - 1  |
> | Ultra Blacks  | 0 - 0.0005|
> 
> 
> ## Describing the luminance space
> 
> **We propose a new drm_plane property to describe the Electro-Optical Transfer 
> Function (EOTF) with which its framebuffer was composed.** Examples of EOTF 
> are:
> 
> | EOTF      | Description                                                                |
> | --------- |:-------------------------------------------------------------------------- |
> | Gamma 2.2 | a simple 2.2 gamma                                                         |
> | sRGB      | 2.4 gamma with small initial linear section                                |
> | PQ 2084   | SMPTE ST 2084; used for HDR video and allows for up to 10,000 nit support  |
> | Linear    | Linear relationship between pixel value and luminance value                |
> 
> 
> ## Mastering Luminances
> 
> Now we are able to use the PQ 2084 EOTF to define the luminance of pixels in 
> absolute terms. Unfortunately we're again presented with physical limitations 
> of the display technologies on the market today. Here are a few examples of 
> luminance ranges of displays.
> 
> | Display  | Luminance range in nits |
> |  | --- |
> | Typical PC display   | 0.3 - 200   |
> | Excellent LCD HDTV   | 0.3 - 400   |
> | HDR LCD w/ local dimming | 0.05 - 1,500|
> 
> Since no display can currently show the full 0.0005 to 10,000 nits luminance 
> range the display will need to tonemap the HDR content, i.e to fit the 
> content within a display's capabilities. To assist with tonemapping HDR 
> content is usually accompanied with a metadata that describes (among other 
> things) the minimum and maximum mastering luminance, i.e. the maximum and 
> minimum luminance of the display that was used to master the HDR content.
> 
> The HDR metadata is currently defined on the drm_connector via the 
> hdr_output_metadata blob property.
> 
> It might be useful to define per-plane hdr metadata, as different planes 
> might have been mastered differently.
> 
> 
> ## SDR Luminance
> 
> Since SDR covers a smaller luminance range than HDR, an SDR plane might look 
> dark when blended with HDR content. Since the max HDR luminance can be quite 
> variable (200-1,500 nits on actual displays) it is best to make the SDR 
> maximum luminance value configurable.
> 
> **We propose a drm_plane property to specify the desired maximum luminance of 
> the SDR plane in nits.** This allows us to map the SDR content predictably 
> into HDR's absolute luminance space.
> 
> 
> ## Let There Be Color
> 
> So far we've only talked about