Re: [osg-users] Frame syncing over multiple contexts

2012-01-20 Thread Robert Osfield
Hi John,

On 19 January 2012 20:58, John Kelso ke...@nist.gov wrote:
 OK then! This is getting good!

 I tried setting setEndBarrierPosition(BeforeSwapBuffers), setting
 setThreadingModel(CullDrawThreadPerContext), and running with four windows,
 each with a single camera, on a desktop system with a single graphics card,
 and the problem didn't go away.

 But should the problem go away in this environment?

Without swap ready and genlock you can't perfectly sync the displays.
Swap ready is a signal that all the rendering on a graphics card has
completed and the card is ready to swap.  Genlock is a hardware sync
of the refresh itself that ensures that the scan-out for each display
occurs at the same time.

You can approximate a swap ready by doing a glFlush() or glFinish()
prior to the swap barrier.  The osg::BarrierOperation (implemented in
include/osg/GraphicsThread) has support for doing either a glFlush()
or a glFinish() prior to joining the barrier, but currently osgViewer
defaults to setting up the BarrierOperation with both of these off.
Just checking the source, I see there isn't an access method for
getting the end barrier.  You could add one (subclassing from
osgViewer::Viewer would give you a way of doing this); once you have
the end barrier you'll be able to set
BarrierOperation::_preBlockOp.  osgViewer::ViewerBase really should
have this built in, though, so please feel free to add and submit it.

Adding a glFlush() prior to the barrier will just make sure the whole
OpenGL FIFO is dispatched to the graphics card, so it doesn't
guarantee that everything will have completed on the graphics card
before it returns.  A glFinish() prior to the barrier will make sure
that all of the OpenGL FIFO is dispatched and the graphics card has
completed all operations before it returns, so you can be sure that
things are properly ready to swap.  There is one gotcha with all of
these, though: waiting for the OpenGL pipeline to be flushed and
completed down on the graphics card adds a delay on the CPU that
breaks the parallelism between the CPU and GPU, so performance can be
lower.

Robert.


Re: [osg-users] Frame syncing over multiple contexts

2012-01-20 Thread Jason Daly

On 01/20/2012 04:32 AM, Robert Osfield wrote:

Hi John,

On 19 January 2012 20:58, John Kelso ke...@nist.gov wrote:

OK then! This is getting good!

I tried setting setEndBarrierPosition(BeforeSwapBuffers), setting
setThreadingModel(CullDrawThreadPerContext), and running with four windows,
each with a single camera, on a desktop system with a single graphics card,
and the problem didn't go away.

But should the problem go away in this environment?

Without swap ready and genlock you can't perfectly sync the displays.
Swap ready is a signal that all the rendering on a graphics card has
completed and the card is ready to swap.  Genlock is a hardware sync
of the refresh itself that ensures that the scan-out for each display
occurs at the same time.


This is four contexts on just one GPU and display, though.  I don't 
think genlock/swap ready would come into play here.  (glFlush/glFinish 
might, though)


On a related note, does anyone know if the NV_swap_group extension 
mentioned earlier would work in the case of multiple contexts on one 
GPU/display, or does it only support synchronizing swap between 
GPUs/displays?


--J




Re: [osg-users] Frame syncing over multiple contexts

2012-01-19 Thread John Kelso

Hi all,

We have seen the same behavior as Anna in our immersive system. It has four
screens; each screen has a single graphics context and either one or two
cameras (depending on whether we're running mono or stereo). The system is
driven by an NVIDIA QuadroPlex containing four FX5800 cards, one card per
screen. We're running CentOS Linux.

As a test I tried a configuration with one graphics context containing four
cameras with non-overlapping viewports, and in this case the graphics in all
of the viewports appear to update at the same time.

As a second test I tried a configuration with four graphics contexts on the
same card, each context having a single camera. In this case I could see
each window being updated at a different time.

I also tried setting the traits->swapGroupEnabled value to true but nothing
changed.

So as far as I can tell we are syncing swaps within a graphics context, but
not between graphics contexts. At least, that's how I interpret what I'm seeing.

This may or may not be relevant, but we use one osgViewer::Viewer object
and all of the cameras we use are slave cameras of the master camera in the
viewer. Our graphics loop just calls osgViewer::Viewer::frame().
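In case it helps, our setup is roughly equivalent to this sketch (the
geometry and screen numbers are placeholders):

    #include <osgViewer/Viewer>

    // one slave camera per graphics context, all driven by one Viewer
    void addSlaveWindow(osgViewer::Viewer& viewer, int x, int screenNum)
    {
        osg::ref_ptr<osg::GraphicsContext::Traits> traits =
            new osg::GraphicsContext::Traits;
        traits->x = x;  traits->y = 0;
        traits->width = 512;  traits->height = 512;
        traits->doubleBuffer = true;
        traits->screenNum = screenNum;

        osg::ref_ptr<osg::GraphicsContext> gc =
            osg::GraphicsContext::createGraphicsContext(traits.get());

        osg::ref_ptr<osg::Camera> camera = new osg::Camera;
        camera->setGraphicsContext(gc.get());
        camera->setViewport(0, 0, traits->width, traits->height);

        // identity offsets: every window shows the master camera's view
        viewer.addSlave(camera.get(), osg::Matrix(), osg::Matrix());
    }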

I see some methods in the osgViewer::ViewerBase class that might be relevant
to the problem, but I'm unclear about which ones to set to what value.

Any suggestions?

Many thanks,

John




Re: [osg-users] Frame syncing over multiple contexts

2012-01-19 Thread Robert Osfield
Hi John,



On 19 January 2012 16:32, John Kelso ke...@nist.gov wrote:
 I also tried setting the traits->swapGroupEnabled value to true but nothing
 changed.

swapGroupEnabled is currently only used by GraphicsWindowWin32, so it
won't affect Linux systems in any way.

 So as far as I can tell we are syncing swaps within a graphics context, but
 not between graphic contexts. At least that's how I interpret what I'm
 seeing.

 This may or may not be relevant, but we use one osgViewer::Viewer object
 and all of the cameras we use are slave cameras of the master camera in the
 viewer. Our graphics loop just calls osgViewer::Viewer::frame().

As long as you run the viewer multithreaded, the OSG will use a
barrier so that each graphics thread waits at the end of draw
dispatch; once all the threads join this barrier they all move on
together and call swap buffers.  This is done to try to achieve
synchronized swapping; however, it's not a foolproof scheme, as it
doesn't use any low-level driver or hardware synchronization.

Extensions to some OpenGL drivers exist to enable low-level
synchronisation, such as swap groups, swap ready and genlock.
GraphicsWindowX11 doesn't have support for these extensions.  One
could potentially add this to an existing application via a custom
osg::Camera::FinalDrawCallback.  It would be nice to have support for
these extensions in the core, though, so feel free to pitch in ;-)
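Something like the following could be a starting point (completely
untested; GLX_NV_swap_group is the relevant extension, and the
GraphicsWindowX11 accessors should be checked against the headers):

    #include <osg/Camera>
    #include <osgViewer/api/X11/GraphicsWindowX11>
    #include <GL/glx.h>

    typedef Bool (*PFNGLXJOINSWAPGROUPNVPROC)(Display*, GLXDrawable, GLuint);

    // Join an NV swap group the first time the final draw callback
    // runs, i.e. while the camera's context is current.
    class JoinSwapGroupCallback : public osg::Camera::DrawCallback
    {
    public:
        JoinSwapGroupCallback(GLuint group) : _group(group), _joined(false) {}

        virtual void operator()(osg::RenderInfo& renderInfo) const
        {
            if (_joined) return;
            _joined = true;

            osgViewer::GraphicsWindowX11* window =
                dynamic_cast<osgViewer::GraphicsWindowX11*>(
                    renderInfo.getState()->getGraphicsContext());
            if (!window) return;

            PFNGLXJOINSWAPGROUPNVPROC glXJoinSwapGroupNV =
                (PFNGLXJOINSWAPGROUPNVPROC)glXGetProcAddress(
                    (const GLubyte*)"glXJoinSwapGroupNV");
            if (glXJoinSwapGroupNV)
                glXJoinSwapGroupNV(window->getDisplay(), window->getWindow(),
                                   _group);
        }

    private:
        GLuint _group;
        mutable bool _joined;
    };

    // usage: camera->setFinalDrawCallback(new JoinSwapGroupCallback(1));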

Robert.


Re: [osg-users] Frame syncing over multiple contexts

2012-01-19 Thread Paul Martz

On 1/19/2012 9:57 AM, Robert Osfield wrote:

As long as you run the viewer multithreaded, the OSG will use a
barrier so that each graphics thread waits at the end of draw
dispatch; once all the threads join this barrier they all move on
together and call swap buffers.


Hi Robert -- The default value for ViewerBase::_endBarrierPosition appears to be 
AfterSwapBuffers. Does John need to change this to BeforeSwapBuffers in order to 
get the behavior you describe above?


Thanks,
   -Paul





Re: [osg-users] Frame syncing over multiple contexts

2012-01-19 Thread Robert Osfield
Hi Paul,

On 19 January 2012 18:48, Paul Martz pma...@skew-matrix.com wrote:
 Hi Robert -- The default value for ViewerBase::_endBarrierPosition appears
 to be AfterSwapBuffers. Does John need to change this to BeforeSwapBuffers
 in order to get the behavior you describe above?

Man I'm impressed, I'd forgotten implementing the EndBarrierPosition
and its default. I presume I set the default to AfterSwapBuffers to
avoid the possible performance drop from waiting to sync the swap
buffers dispatch.

John should indeed change EndBarrierPosition to BeforeSwapBuffers using:

  viewer.setEndBarrierPosition(osgViewer::Viewer::BeforeSwapBuffers);

;-)

Robert.


Re: [osg-users] Frame syncing over multiple contexts

2012-01-19 Thread John Kelso

OK then! This is getting good!

I tried setting setEndBarrierPosition(BeforeSwapBuffers), setting
setThreadingModel(CullDrawThreadPerContext), and running with four windows,
each with a single camera, on a desktop system with a single graphics card,
and the problem didn't go away.

But should the problem go away in this environment?

We'll get a chance to test the same fix in our immersive environment soon
and I'll report back.

Many thanks,

John




Re: [osg-users] Frame syncing over multiple contexts

2012-01-16 Thread Robert Osfield
Hi Anna,

This should work out of the box - the two windows should be rendering
and synchronised.  How are you setting up your viewer and graphics
contexts?

As a general note, if you are using a single graphics card, for best
performance one usually tries to use a single graphics window and have
two or more cameras share this context.  Is there a reason why you
need two separate windows rather than a single window with two views?

Robert.

On 14 January 2012 21:21, Anna Sokol annaso...@gmail.com wrote:
 Hi,

 I am trying to figure out how to keep multiple graphics contexts in frame
 sync.

 My operating system is Windows XP SP3.
 My graphics card is an NVIDIA Quadro NVS 290 with the latest driver, 276.42.
 I'm using OpenSceneGraph 3.0.1 compiled with Visual C++ 2005 for win32.

 I have vsync on in the driver and in Traits, and I am using
 CullDrawThreadPerContext as the threading model.
 I have 2 graphics windows with separate contexts showing the same scene with
 a left and a right view on one display.
 I have the scene moving across both windows so that I can see if it's
 properly syncing.
 It sometimes visibly looks to be a number of frames out of sync (i.e. one of
 the rendered contexts is lagging behind).
 What could be causing this? Is it in the threads? Or down in the graphics card?
 Are there any specific settings I should set to make the rendered contexts
 stay in frame sync?


 Regards,
 Anna Sokol
 _
 Once we accept our limits, we go beyond them.  -- Albert Einstein



Re: [osg-users] Frame syncing over multiple contexts

2012-01-16 Thread Laurens Voerman

Hi Anna, Robert,
I think buffer swaps on Windows are not synchronized by default; a
call to
wglJoinSwapGroupNV(HDC hdc, GLuint group) is needed to make different
windows synchronize.

The osg lib has the code to make the call; just set
traits->swapGroupEnabled = true;
before
createGraphicsContext(traits);
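Fleshed out, the setup would look something like this (a sketch; the
window geometry is just an example, and the swapGroup/swapBarrier
fields should be checked against your OSG version's
GraphicsContext::Traits):

    osg::ref_ptr<osg::GraphicsContext::Traits> traits =
        new osg::GraphicsContext::Traits;
    traits->x = 0;  traits->y = 0;
    traits->width = 800;  traits->height = 600;
    traits->windowDecoration = true;
    traits->doubleBuffer = true;
    traits->vsync = true;               // sync each swap to the vertical blank
    traits->swapGroupEnabled = true;    // triggers the wglJoinSwapGroupNV call
    traits->swapGroup = 0;              // group id handed to the driver
    traits->swapBarrier = 0;            // barrier id; needs a gsync card

    osg::ref_ptr<osg::GraphicsContext> gc =
        osg::GraphicsContext::createGraphicsContext(traits.get());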


The output should look like this (with OSG_NOTIFY_LEVEL=INFO set):
GraphicsCostEstimator::calibrate(..)
GraphicsWindowWin32::setSyncToVBlank on
GraphicsWindowWin32::wglJoinSwapGroupNV (0) returned 1
GraphicsWindowWin32::wglBindSwapBarrierNV (0, 0) returned 0

The wglBindSwapBarrierNV call fails if you don't have a G-Sync card (a
hardware card that links multiple graphics cards for synchronization).


Still, as Robert says, a single graphics window is likely to perform
better, and is of course automatically in sync. But I suppose you
don't want fullscreen with stereo mode VERTICAL_SPLIT.

Laurens.


