On 01/26/2015 02:44 PM, Ian Romanick wrote:
> On 01/26/2015 08:30 AM, Brian Paul wrote:
>> To avoid a compile failure on NVIDIA.  The mix function parameter is
>> hiding the built-in mix() function.  The point of this test is not
>> to exercise compile-time name resolution.
>
> Is NVIDIA's behavior correct?

I don't know. I haven't had time to dig through the spec for an answer, and I haven't tried any test programs.


> We should add a test to check for the
> correct behavior (whichever way that is).
>
> Either way, this change is
>
> Reviewed-by: <[email protected]>

Thanks.

-Brian


>> ---
>>  tests/shaders/glsl-fs-functions-samplers.shader_test | 4 ++--
>>  1 file changed, 2 insertions(+), 2 deletions(-)
>>
>> diff --git a/tests/shaders/glsl-fs-functions-samplers.shader_test b/tests/shaders/glsl-fs-functions-samplers.shader_test
>> index 2273bdd..c4fd69a 100644
>> --- a/tests/shaders/glsl-fs-functions-samplers.shader_test
>> +++ b/tests/shaders/glsl-fs-functions-samplers.shader_test
>> @@ -18,9 +18,9 @@ varying vec2 tc2;
>>  uniform sampler2D tex1;
>>  uniform sampler2D tex2;
>>
>> -vec4 blend_textures(sampler2D t1, vec2 tc1, sampler2D t2, vec2 tc2, float mix)
>> +vec4 blend_textures(sampler2D t1, vec2 tc1, sampler2D t2, vec2 tc2, float mixFactor)
>>  {
>> -       return mix(texture2D(t1, tc1), texture2D(t2, tc2), mix);
>> +       return mix(texture2D(t1, tc1), texture2D(t2, tc2), mixFactor);
>>  }
>>
>>  void main()

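For reference, the follow-up test Ian suggests could be a sketch along
these lines, in piglit's glslparsertest format (the file name and the
expect_result value are placeholders until the spec question is settled
one way or the other):

// [config]
// expect_result: pass
// glsl_version: 1.10
// [end config]
//
// Does a function parameter named "mix" hide the built-in mix()?
// expect_result above is a placeholder: change it to "fail" if the
// spec turns out to require the parameter to hide the built-in,
// which would make the call below a compile error.

vec4 blend(vec4 a, vec4 b, float mix)
{
    return mix(a, b, mix);
}

void main()
{
    gl_FragColor = blend(vec4(0.0), vec4(1.0), 0.5);
}

Whichever result the spec mandates, running such a test against both
Mesa and NVIDIA would at least document where the implementations
currently disagree.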


_______________________________________________
Piglit mailing list
[email protected]
http://lists.freedesktop.org/mailman/listinfo/piglit