Re: [Intel-gfx] [PATCH] Remove illogical/bogus Automatic mode from Broadcast RGB property
On Tue, Apr 14, 2015 at 02:22:53PM +0800, Tom Yan wrote:
> It's not about whether it follows every line of the spec, but whether it
> makes sense to follow. But I'm not going to argue about this anymore. I've
> written enough to show the logical error, yet it seems I am the only one
> in the world who sees a problem in switching color range according to
> resolution and refresh rate, and I don't see why any hardware vendor
> would or should follow it either. So if you don't see a problem here, do
> whatever you like.

You assume no one follows the spec, but all I have to do is look at the TV next to me to see that you're wrong.

It's extremely unfortunate that displays can apparently get the appropriate logo without following the spec. I can only conclude that whatever conformance testing is done is insufficient.

One option I was thinking of would be to find some other hint in the EDID that could tell us whether the device is likely to be spec compliant, but sadly I've not found anything that would appear to work.

[snip]

--
Ville Syrjälä
Intel OTC
_______________________________________________
Intel-gfx mailing list
Intel-gfx@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/intel-gfx
Re: [Intel-gfx] [PATCH] Remove illogical/bogus Automatic mode from Broadcast RGB property
It's not about whether it follows every line of the spec, but whether it makes sense to follow. But I'm not going to argue about this anymore. I've written enough to show the logical error, yet it seems I am the only one in the world who sees a problem in switching color range according to resolution and refresh rate, and I don't see why any hardware vendor would or should follow it either. So if you don't see a problem here, do whatever you like.

On 13 April 2015 at 22:22, Ville Syrjälä <ville.syrj...@linux.intel.com> wrote:
> Sure, if you can actually find a display that supports the Q bit. I've not
> seen one yet :( They should have just made it mandatory, otherwise I fear
> it's never going to catch on.
[snip]
Re: [Intel-gfx] [PATCH] Remove illogical/bogus Automatic mode from Broadcast RGB property
Are you sure it can only handle limited range in CEA modes and only full range in non-CEA modes? Or is it that it doesn't handle full range at all, and its (or your) preferred mode happens to be a CEA mode? Or that the whole series of this model only supports full range, and the unit you got happens to have a non-CEA preferred mode, while another model in the same series with a CEA preferred mode would suffer anyway?

Yes, I can't tell how every vendor made their hardware, but it's silly for them to follow this spec, and it's even sillier for you to make a driver specifically for those vendors. For example, if a user wants to buy a monitor for photo editing, he must carefully avoid the ones with a preferred CEA mode (or DP/HDMI ports). If one day CEA declares that every mode belongs to them, the user will have nowhere to go.

A device with an HDMI port might SEEM to work anyway because of convention. But for one with only DP (and DVI/VGA), which makes it literally a pure PC monitor, I wonder how many have ANY support for limited range at all. (Nor should they; think about what a PC monitor is targeted at.)

The EDID does have something that can imply whether a device supports limited range: its declared support for YCbCr color spaces, which are inherently limited range.

P.S. In the case of the NVIDIA blob, it locks the output range to limited if it reads an HDMI vendor identifier in the EDID (while a switch is available for VDPAU only, which acts in a weird reverse way), and leaves DP untouched. This is not a very good reference, but it seems to be necessary for YCbCr output in their implementation, and even it makes more sense than the mode/range mechanism.

On 14 April 2015 at 16:38, Ville Syrjälä <ville.syrj...@linux.intel.com> wrote:
> You assume no one follows the spec, but all I have to do is look at the TV
> next to me and see that you're wrong. [...]
[snip]
Re: [Intel-gfx] [PATCH] Remove illogical/bogus Automatic mode from Broadcast RGB property
I feel silly writing yet another mail about this, but I really want whoever reads this list to have a chance to at least think about why the "CEA modes imply limited range" mechanism exists and where it should (and should not) apply. This is for those who feel thrown out of their comfort zone when told a spec is nonsense. (Yes, it probably makes sense SOMEwhere.)

First, what are CEA modes? Why are certain modes considered special by the Consumer Electronics Association? Compare these two lists and you should understand:

http://lxr.free-electrons.com/source/drivers/gpu/drm/drm_edid.c#L606
http://en.wikipedia.org/wiki/Blu-ray_Disc#Video

Yes, these are the standard modes of video discs like Blu-ray and DVD (and of TV broadcasting).

Okay, so why does the spec say these modes should be used with limited color range? There's a second reason I compare the mode list with the table in the Blu-ray Video section on Wikipedia: video discs and files are encoded with limited color range. (Google the reason yourself :P)

Then why didn't the spec require limited color range for every possible mode in the world? A possible reason: it gives AV products (I think a TV, as a sink, might be the only realistic case) a rule to follow when a vendor wants to support both full and limited range without introducing a setting that requires the user to understand this whole thing. That might be why some people claim their devices follow this rule, and it might well be true: TV vendors may want to give users the best experience by keeping them ignorant, while at the same time stopping a user who needs full range with CEA modes from picking it himself. (Time to think: should we follow these vendors then?) So even if the hardware supports both ranges, there is no option to change it manually.
But the thing is, this rule PRIORITIZES video disc/file playback: it assumes that whenever the source is in a CEA mode it is playing video, hence limited color range, and it uses full range for everything else (because limited range is rarely used, necessary, or appropriate elsewhere). So what's the problem if we write a graphics driver that follows this rule for HDMI and DP? By following it, the graphics card (the PC) is PRIORITIZED for video disc/file playback, leaving every other job a second-class citizen. Remember, your PC is not a Blu-ray player even if you mainly use it for video playback, and not everyone uses it that way. Worse still, video playback software might not actually handle limited-range output appropriately, because it may assume PC monitors support full range only. So even if your monitor or TV supports it, you might still suffer a Limited (source file) -> Full (interpolated by the player) -> Limited (compressed again by the graphics driver) conversion.

For HDMI, at least for TVs, following the rule might not cause a problem, because they are mostly expected to deal with sources like Blu-ray/DVD players, for which limited-range-only output is perfectly acceptable. The only weakness is that it makes you suffer the lossily compressed limited color range. But some HDMI PC monitors that do not assume they will be paired with such players might fail to support limited range. Yet I wouldn't call that a flaw of theirs: if it's made as a PC monitor, it's fine to support only full range, which is what general-purpose use calls for.

The worst case is following the rule on DisplayPort. Sorry, I can't help this time. I can only say that VESA was stupid, or actually greedy, to adopt this rule.
The only reason I can think of for VESA to adopt it is that, at the beginning (or maybe still), they thought DisplayPort was going to replace HDMI and become the only connector in both the PC and AV worlds, so they tried to give it as many of HDMI's basic properties as possible, so as not to upset any AV product vendor. But the thing is, nobody in the AV world gives a damn. If you ever see a Blu-ray player, or even a TV, with a DisplayPort input, don't forget to take a picture (and send it to me, please, I beg you). And in the PC world, nobody wants limited range for photo editing, gaming, or even web browsing and coding. Why? It's simple subtraction.

So in the end, with the help of some graphics card vendors, you and I get burned by VESA's greed if we buy a DP monitor that happens to have a preferred CEA mode.

On 14 April 2015 at 19:35, Tom Yan <tom.t...@gmail.com> wrote:
> By the way it's extremely FORTUNATE that many vendors does not follow this
> silly rule from CEA [...]
[snip]
Re: [Intel-gfx] [PATCH] Remove illogical/bogus Automatic mode from Broadcast RGB property
By the way, it's extremely FORTUNATE that many vendors do not follow this silly rule from CEA, because doing so means nothing less than letting CEA make certain modes proprietary. It is of course ideal for a device to support both ranges. But when cost is considered, the only sane compromise is to pick the range suitable for the targeted use (AV playback only vs. general PC usage), not to follow that stupid rule by making this one support limited range only:

http://www.eizoglobal.com/products/flexscan/ev2336w/index.html

and this one support full range only:

http://www.eizoglobal.com/products/flexscan/ev2436w/index.html

while both are obviously PC monitors and belong to the same series. (Luckily, in reality, Eizo DIDN'T follow that rule.)

On 14 April 2015 at 19:18, Tom Yan <tom.t...@gmail.com> wrote:
> Are you sure it can only handle limited range in CEA modes and only full
> range in non CEA modes? [...]
[snip]
[Intel-gfx] [PATCH] Remove illogical/bogus Automatic mode from Broadcast RGB property
https://bugzilla.kernel.org/show_bug.cgi?id=94921

As mentioned in the above bug report, switching the output color range
automatically according to the current mode does not make sense in the
computer use case.
---
 drivers/gpu/drm/i915/i915_drv.h    |  5 ++---
 drivers/gpu/drm/i915/intel_dp.c    | 23 ++-
 drivers/gpu/drm/i915/intel_hdmi.c  | 20 ++--
 drivers/gpu/drm/i915/intel_modes.c |  1 -
 drivers/gpu/drm/i915/intel_sdvo.c  | 29 +
 5 files changed, 11 insertions(+), 67 deletions(-)

diff --git a/drivers/gpu/drm/i915/i915_drv.h b/drivers/gpu/drm/i915/i915_drv.h
index 8727086..2385732 100644
--- a/drivers/gpu/drm/i915/i915_drv.h
+++ b/drivers/gpu/drm/i915/i915_drv.h
@@ -3287,9 +3287,8 @@ int intel_freq_opcode(struct drm_i915_private *dev_priv, int val);
 #define POSTING_READ16(reg)	(void)I915_READ16_NOTRACE(reg)
 
 /* Broadcast RGB property */
-#define INTEL_BROADCAST_RGB_AUTO 0
-#define INTEL_BROADCAST_RGB_FULL 1
-#define INTEL_BROADCAST_RGB_LIMITED 2
+#define INTEL_BROADCAST_RGB_FULL 0
+#define INTEL_BROADCAST_RGB_LIMITED 1
 
 static inline uint32_t i915_vgacntrl_reg(struct drm_device *dev)
 {
diff --git a/drivers/gpu/drm/i915/intel_dp.c b/drivers/gpu/drm/i915/intel_dp.c
index a74aaf9..a9ed631 100644
--- a/drivers/gpu/drm/i915/intel_dp.c
+++ b/drivers/gpu/drm/i915/intel_dp.c
@@ -1238,18 +1238,6 @@ intel_dp_compute_config(struct intel_encoder *encoder,
 		return false;
 
 found:
-	if (intel_dp->color_range_auto) {
-		/*
-		 * See:
-		 * CEA-861-E - 5.1 Default Encoding Parameters
-		 * VESA DisplayPort Ver.1.2a - 5.1.1.1 Video Colorimetry
-		 */
-		if (bpp != 18 && drm_match_cea_mode(adjusted_mode) > 1)
-			intel_dp->color_range = DP_COLOR_RANGE_16_235;
-		else
-			intel_dp->color_range = 0;
-	}
-
 	if (intel_dp->color_range)
 		pipe_config->limited_color_range = true;
 
@@ -4220,27 +4208,20 @@ intel_dp_set_property(struct drm_connector *connector,
 	}
 
 	if (property == dev_priv->broadcast_rgb_property) {
-		bool old_auto = intel_dp->color_range_auto;
 		uint32_t old_range = intel_dp->color_range;
 
 		switch (val) {
-		case INTEL_BROADCAST_RGB_AUTO:
-			intel_dp->color_range_auto = true;
-			break;
 		case INTEL_BROADCAST_RGB_FULL:
-			intel_dp->color_range_auto = false;
 			intel_dp->color_range = 0;
 			break;
 		case INTEL_BROADCAST_RGB_LIMITED:
-			intel_dp->color_range_auto = false;
 			intel_dp->color_range = DP_COLOR_RANGE_16_235;
 			break;
 		default:
 			return -EINVAL;
 		}
 
-		if (old_auto == intel_dp->color_range_auto &&
-		    old_range == intel_dp->color_range)
+		if (old_range == intel_dp->color_range)
 			return 0;
 
 		goto done;
@@ -4548,7 +4529,7 @@ intel_dp_add_properties(struct intel_dp *intel_dp, struct drm_connector *connector)
 	intel_attach_force_audio_property(connector);
 	intel_attach_broadcast_rgb_property(connector);
-	intel_dp->color_range_auto = true;
+	intel_dp->color_range = 0;
 
 	if (is_edp(intel_dp)) {
 		drm_mode_create_scaling_mode_property(connector->dev);
diff --git a/drivers/gpu/drm/i915/intel_hdmi.c b/drivers/gpu/drm/i915/intel_hdmi.c
index 995c5b2..4fff5c6 100644
--- a/drivers/gpu/drm/i915/intel_hdmi.c
+++ b/drivers/gpu/drm/i915/intel_hdmi.c
@@ -990,15 +990,6 @@ bool intel_hdmi_compute_config(struct intel_encoder *encoder,
 	if (pipe_config->has_hdmi_sink)
 		pipe_config->has_infoframe = true;
 
-	if (intel_hdmi->color_range_auto) {
-		/* See CEA-861-E - 5.1 Default Encoding Parameters */
-		if (pipe_config->has_hdmi_sink &&
-		    drm_match_cea_mode(adjusted_mode) > 1)
-			intel_hdmi->color_range = HDMI_COLOR_RANGE_16_235;
-		else
-			intel_hdmi->color_range = 0;
-	}
-
 	if (adjusted_mode->flags & DRM_MODE_FLAG_DBLCLK) {
 		pipe_config->pixel_multiplier = 2;
 	}
@@ -1196,27 +1187,20 @@ intel_hdmi_set_property(struct drm_connector *connector,
 	}
 
 	if (property == dev_priv->broadcast_rgb_property) {
-		bool old_auto = intel_hdmi->color_range_auto;
 		uint32_t old_range = intel_hdmi->color_range;
 
 		switch (val) {
-		case INTEL_BROADCAST_RGB_AUTO:
-			intel_hdmi->color_range_auto = true;
-			break;
 		case INTEL_BROADCAST_RGB_FULL:
-			intel_hdmi->color_range_auto = false;
Re: [Intel-gfx] [PATCH] Remove illogical/bogus Automatic mode from Broadcast RGB property
On Wed, Apr 08, 2015 at 07:18:06PM +0800, Tom Yan wrote:
> https://bugzilla.kernel.org/show_bug.cgi?id=94921
>
> As mentioned in the above bug report, switching the output color range
> automatically according to the current mode does not make sense in the
> computer use case.

Current code seems correct to me after re-reading CEA-861-E again. However, can we do better? Maybe! From the spec:

  The QS (AVI Q support) bit of byte 3 allows a display to declare that it
  supports the reception of either type of quantization range for any video
  format, under the direction of InfoFrame Q data (see Section 6.4 for
  information concerning bits Q1 and Q0). This allows a source to override
  the default quantization range for any video format.

  If the sink declares a selectable RGB Quantization Range (QS=1) then it
  shall expect limited range pixel values if it receives Q=1 and it shall
  expect full range pixel values if it receives Q=2 (see section 6.4). For
  other values of Q, the sink shall expect pixel values with the default
  range for the transmitted video format.

So, for sinks that support it, we could default to sending the full range picture and overriding the quantization bits in the AVI infoframe. Could you try running edid-decode [1] on your sink's EDID to check whether it supports overriding the quantization level (I added decoding of the VCDB a while back)?

Ville, what do you think?

--
Damien

[1] http://cgit.freedesktop.org/xorg/app/edid-decode
Re: [Intel-gfx] [PATCH] Remove illogical/bogus Automatic mode from Broadcast RGB property
On Mon, Apr 13, 2015 at 03:13:12PM +0100, Damien Lespiau wrote:
> So, for sinks that support it, we could default to sending the full range
> picture and overriding the quantization bits in the AVI infoframe.
>
> Ville, what do you think?
[snip]

Sure, if you can actually find a display that supports the Q bit. I've not seen one yet :( They should have just made it mandatory, otherwise I fear it's never going to catch on.

--
Ville Syrjälä
Intel OTC