https://bugs.freedesktop.org/show_bug.cgi?id=108321

            Bug ID: 108321
           Summary: rv610: corrupt shader output with SB if_conversion
                    optimization pass
           Product: Mesa
           Version: git
          Hardware: x86-64 (AMD64)
                OS: Linux (All)
            Status: NEW
          Severity: normal
          Priority: medium
         Component: Drivers/Gallium/r600
          Assignee: dri-devel@lists.freedesktop.org
          Reporter: nicholasbis...@gmail.com
        QA Contact: dri-devel@lists.freedesktop.org

Created attachment 141984
  --> https://bugs.freedesktop.org/attachment.cgi?id=141984&action=edit
rv610 sbdump

I'm debugging some graphics corruption in Chromium on an iMac7,1 with rv610
graphics: an area of the application shows random-looking blocks of color, as
if it were reading from an uninitialized texture.

I've narrowed the problem down to a fragment shader:

    varying mediump vec2 _uv_texCoord;
    uniform lowp sampler2D _us_texture;   // source texture sampled in main()
    uniform lowp sampler2D _ulut_texture;
    uniform mediump float _ulut_size;

    mediump vec4 _uLUT(in lowp sampler2D _usampler, in mediump vec3 _upos,
                       in mediump float _usize){
      (_upos *= (_usize - 1.0));
      mediump float _ulayer = min(floor(_upos.z), (_usize - 2.0));
      (_upos.xy = ((_upos.xy + vec2(0.51234001, 0.51234001)) / _usize));
      (_upos.y = ((_upos.y + _ulayer) / _usize));
      return mix(texture2D(_usampler, _upos.xy),
                 texture2D(_usampler, (_upos.xy + vec2(0, (1.0 / _usize)))),
                 (_upos.z - _ulayer));
    }

    void main(){
      mediump vec2 _utexCoord = _uv_texCoord;
      mediump vec4 _utexColor = texture2D(_us_texture, _utexCoord);
      if ((_utexColor.w > 0.0))
      {
        (_utexColor.xyz /= _utexColor.w);
      }
      (_utexColor.xyz = _uLUT(_ulut_texture, _utexColor.xyz, _ulut_size).xyz);
      (_utexColor.xyz *= _utexColor.w);
      (gl_FragColor = vec4(_utexColor.xyz, 1.0));
    }

(Note that in the original shader "0.51234001" is just "0.5"; I changed it to
make grepping easier.) I experimented with changes to the _uLUT function and
found that any mixing of the two samples triggers the bug: mixing with a
constant 0.5 still reproduces it, and replacing "mix" with the explicit
equivalent a*(1.0-x) + b*x made no difference. A reduction of this kind is
sketched below.

I found that disabling SB entirely (e.g. with R600_DEBUG=nosb) fixes the
issue, and in particular just commenting out the "if_conversion" SB pass is
enough to fix it.

I've attached the full sbdump output. Please let me know if there are more
details I can provide that would help.
