Hi Tim,
I have just checked in the following workaround:
    const GLubyte* renderer = glGetString(GL_RENDERER);
    std::string rendererString = renderer ? (const char*)renderer : "";
    if (rendererString.find("Radeon")!=std::string::npos ||
        rendererString.find("RADEON")!=std::string::npos)
    {
        // AMD/ATI drivers produce an invalid enum error on the
        //   glGetQueryiv(GL_TIMESTAMP, GL_QUERY_COUNTER_BITS_ARB, &bits);
        // call, so work around it by assuming a 64 bit counter.
        setTimestampBits(64);
        //setTimestampBits(0);
    }
    else
    {
        GLint bits = 0;
        extensions->glGetQueryiv(GL_TIMESTAMP, GL_QUERY_COUNTER_BITS_ARB, &bits);
        setTimestampBits(bits);
    }
I've tested this out on my ATI 4670 + Kubuntu 10.10 system and it
kinda works, but not too well. Sometimes I get the orange bars
appearing but typically quite a few frames in, and the bars often
appear and then disappear, sometimes never to appear again. I've
experimented with adding a glFinish() to
osg::State::frameCompleted(), and this does stabilize things, but it's
still not 100% reliable.
Any thoughts on why the timing doesn't seem reliable? It might simply
be that the GL_TIMESTAMP code in the driver is flaky and just needs to
be disabled completely for ATI cards. I guess there is a chance that
you'll see the instability under NVidia as well.
Could you do an svn update and test the new code out to make sure
there hasn't been a regression under NVidia?
Cheers,
Robert.
_______________________________________________
osg-submissions mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-submissions-openscenegraph.org