Hi Andy,
I have seen some funky effects recently on the Radeon 5970 and 4890. I was
not able to install Catalyst 10.6 on Windows 7 64-bit with the 5970; the
driver kept crashing with the classic BSOD. The texture corruption you
mention was present in 10.5 on the Radeon 4890 as well. In both cases we
switched back to Catalyst 10.4. We have read on the forums that ATI started
to experiment with GL 3.3/4.0 recently and broke OpenGL 2.0 compatibility
in the latest drivers.
My other observations on Catalyst 10.4 are that the texture matrix default
is not identity in shaders, and the light attenuation factors seem to be
incorrect in shaders as well. Frankly, I don't have much experience with
testing on Radeons; I have only recently made a few experiments, and I
could not count them as successful. A good thing about Windows 7 is that
one can have both NVIDIA and ATI cards installed together. The card that
drives the main screen renders all OpenGL, so usually after some
frustrating tests on ATI I quickly switch the main monitor to NVIDIA ;-).
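If the driver really does ship a non-identity texture-matrix default, one
defensive workaround is to pin it to identity explicitly on the state set.
A minimal OSG sketch (the function name and the choice of texture unit 0
are my assumptions, not something from this thread):

```cpp
#include <osg/Matrix>
#include <osg/StateSet>
#include <osg/TexMat>

// Force the texture matrix on texture unit 0 to identity, so shaders
// reading gl_TextureMatrix[0] do not depend on the driver's default.
void forceIdentityTexMat(osg::StateSet* stateset)
{
    osg::TexMat* texmat = new osg::TexMat(osg::Matrix::identity());
    stateset->setTextureAttributeAndModes(
        0, texmat, osg::StateAttribute::ON);
}
```

Applying this to the root node's state set should override whatever the
driver picks as its default for the whole subgraph.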
Cheers,
Wojtek
--
From: Andy Peruggi aperu...@ara.com
Sent: Tuesday, June 29, 2010 8:02 PM
To: osg-users@lists.openscenegraph.org
Subject: [osg-users] Graphical issues with display lists on ATI 5870 cards
Hi everyone,
My company has been developing products using OSG for a few years now, and
recently we've noticed some rendering issues on two machines that are both
running ATI 5870 cards using the latest ATI drivers (Catalyst 10.6). We
wanted to know if anyone running similar hardware has seen these issues.
We are using the 2.8.2 release of OSG.
We notice two major graphical problems:
* Incorrect lighting on primitives - flat geometry with equal vertex
normals at all vertices appears to have incorrect lighting over the
surface (using the default osgviewer lighting)
* Corrupted textures - either a single pixel color over the entire surface
or it looks like the UV coordinates were run through a blender
We have tested rendering using osgviewer with a simple .osg file
containing two triangles (forming a quad) with identical per-vertex
normals and simple per-vertex UVs. What we have found is that we either
have to disable display lists on the geometry in order to have the quad
render correctly, or we can get correct results by padding the per-vertex
normal and UV lists with additional (bogus) values. We believe this means
that somewhere between OSG and the GPU the per-vertex data in the display
list is getting corrupted and normals/UVs are getting lost.
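For anyone needing a stopgap, disabling display lists across an entire
scene can be done with a NodeVisitor. A minimal sketch (the class name is
hypothetical; `setUseDisplayList` and `setUseVertexBufferObjects` are
standard osg::Drawable calls):

```cpp
#include <osg/Geode>
#include <osg/Geometry>
#include <osg/NodeVisitor>

// Visitor that turns off display lists on every osg::Geometry it finds,
// switching those drawables to the VBO rendering path instead.
class DisableDisplayListsVisitor : public osg::NodeVisitor
{
public:
    DisableDisplayListsVisitor()
        : osg::NodeVisitor(osg::NodeVisitor::TRAVERSE_ALL_CHILDREN) {}

    virtual void apply(osg::Geode& geode)
    {
        for (unsigned int i = 0; i < geode.getNumDrawables(); ++i)
        {
            osg::Geometry* geom =
                dynamic_cast<osg::Geometry*>(geode.getDrawable(i));
            if (geom)
            {
                geom->setUseDisplayList(false);
                geom->setUseVertexBufferObjects(true);
            }
        }
        traverse(geode);
    }
};
```

Usage would be something like:

```cpp
DisableDisplayListsVisitor ddlv;
scene->accept(ddlv);
```

Enabling VBOs in the same pass keeps batching performance reasonable while
sidestepping the display-list path entirely.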
We have not experienced this issue in any other 3D apps on this hardware,
including ones that use display lists. We also do not have the rendering
issues with our app on other hardware (tested using several NVIDIA cards
and an older ATI card). We think this may be an ATI driver bug, but we're
not sure at this point and would like to hear if anyone else has run into
these issues.
Thanks.
- Andy Peruggi
Applied Research Associates, Inc.
--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=29501#29501
___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org