Well, my mistake, I didn't make myself clear. I do, of course, have vsync set to
"Use application settings" in the drivers (NVIDIA Control Panel); that's the
default value anyway. But I can't seem to make my application disable vsync. In
plain OpenGL I did it quite simply with wglSwapIntervalEXT, but since the OSG
viewer sets up the display, window etc. for me, I think there should be a way to
tell it that I want vsync disabled. Since this is my 3rd post, maybe I can
finally add some code, which I "stole" from the osgwindow example:
Code:
#include <osgViewer/Viewer>
#include <osgViewer/ViewerEventHandlers>

osgViewer::Viewer viewer;

// left window + left slave camera
{
    osg::ref_ptr<osg::GraphicsContext::Traits> traits =
        new osg::GraphicsContext::Traits;
    traits->x = 40;
    traits->y = 40;
    traits->width = 600;
    traits->height = 480;
    traits->windowDecoration = true;
    traits->doubleBuffer = true;
    traits->vsync = false;   // request buffer swaps without vertical sync
    traits->sharedContext = 0;

    osg::ref_ptr<osg::GraphicsContext> gc =
        osg::GraphicsContext::createGraphicsContext(traits.get());

    osg::ref_ptr<osg::Camera> camera = new osg::Camera;
    camera->setGraphicsContext(gc.get());
    camera->setViewport(new osg::Viewport(0, 0, traits->width, traits->height));

    GLenum buffer = traits->doubleBuffer ? GL_BACK : GL_FRONT;
    camera->setDrawBuffer(buffer);
    camera->setReadBuffer(buffer);

    // add this slave camera to the viewer (no projection offset here,
    // the translation is zero)
    viewer.addSlave(camera.get(), osg::Matrixd::translate(0.0, 0.0, 0.0),
                    osg::Matrixd());
}

osg::DisplaySettings::instance()->setMinimumNumStencilBits(8);

viewer.addEventHandler(new osgViewer::StatsHandler);
viewer.setSceneData(ss);   // 'ss' is the scene graph root, set up elsewhere
viewer.realize();
viewer.run();
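
By the way, in case the Traits flag doesn't do the trick, I also noticed that
osgViewer::GraphicsWindow has a setSyncToVBlank() method (on Windows it should
end up calling wglSwapIntervalEXT internally). Here is a minimal sketch of how
I'd try it, assuming that method is available in my OSG version; it would be
called after realize() but before run():

Code:
#include <osgViewer/Viewer>
#include <osgViewer/GraphicsWindow>

// Turn sync-to-vblank off on every window the viewer has created.
// NOTE: setSyncToVBlank() availability depends on the OSG version;
// treat this as a sketch, not a tested solution.
void disableVSync(osgViewer::Viewer& viewer)
{
    osgViewer::Viewer::Windows windows;
    viewer.getWindows(windows);
    for (osgViewer::Viewer::Windows::iterator itr = windows.begin();
         itr != windows.end(); ++itr)
    {
        (*itr)->setSyncToVBlank(false);
    }
}

// usage:
//   viewer.realize();
//   disableVSync(viewer);
//   viewer.run();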