https://bugs.freedesktop.org/show_bug.cgi?id=96684
GitLab Migration User changed:
What       |Removed |Added
Status     |NEW     |RESOLVED
Resolution |---     |
https://bugs.freedesktop.org/show_bug.cgi?id=96684
Timothy Arceri changed:
What       |Removed   |Added
Component  |Mesa core |Drivers/DRI/swrast
https://bugs.freedesktop.org/show_bug.cgi?id=96684
--- Comment #4 from Timothy Arceri ---
I've updated the test to have the same outcome regardless of which branch is
taken.
https://patchwork.freedesktop.org/patch/147474/
This should fix the problem.
https://bugs.freedesktop.org/show_bug.cgi?id=96684
--- Comment #3 from Timothy Arceri ---
I think the test is wrong; it should not expect a specific outcome.
The spec says:
"Behavior is undefined if a shader subscripts an array with an index less than
0 or greater than or equal to the size the array was declared with."
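As a hedged illustration (this is a hypothetical sketch, not the actual piglit
glsl-array-bounds-01 shader), a test that indexes out of bounds cannot assert a
particular value; it can only arrange for every possible outcome to be
acceptable, for example by making both branches of a bounds-dependent condition
write the same result:

```glsl
#version 120
// Hypothetical fragment shader sketch. arr[idx] with idx out of range is
// undefined per the GLSL spec: the implementation may clamp, return any
// value, or read garbage. A robust test must therefore make every branch
// produce the same observable color.
uniform int idx;  // set to an out-of-bounds value by the test harness

void main()
{
    float arr[4] = float[4](1.0, 2.0, 3.0, 4.0);
    // Whatever arr[idx] yields, both branches emit green, so the pass/fail
    // result no longer depends on which branch the implementation takes.
    if (arr[idx] > 0.0)
        gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
    else
        gl_FragColor = vec4(0.0, 1.0, 0.0, 1.0);
}
```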
https://bugs.freedesktop.org/show_bug.cgi?id=96684
--- Comment #2 from Emil Velikov ---
I can reproduce this here, so I gave it a quick look:
The GLSL IR is identical with _and_ without the optimisations (MESA_GLSL=nopt),
yet the result changes to pass. It seems the _mesa_optimize_program invocation
https://bugs.freedesktop.org/show_bug.cgi?id=96684
Vinson Lee changed:
What   |Removed |Added
Blocks |        |98471
Referenced Bugs:
https://bugs.freed
https://bugs.freedesktop.org/show_bug.cgi?id=96684
Vinson Lee changed:
What    |Removed |Added
Version |git     |12.0
https://bugs.freedesktop.org/show_bug.cgi?id=96684
--- Comment #1 from Kenneth Graunke ---
I can't reproduce this. I tried both classic swrast and llvmpipe and it seems
to be working fine.
--
You are receiving this mail because:
You are the assignee for the bug.
You are the QA Contact for the bug.
https://bugs.freedesktop.org/show_bug.cgi?id=96684
Bug ID: 96684
Summary: [swrast] piglit glsl-array-bounds-01 regression
Product: Mesa
Version: git
Hardware: x86-64 (AMD64)
OS: Linux (All)
Status: NEW