I have a question about the latency between when an instruction changing the
scene graph is issued and when the effect of that instruction is visible on
the screen.

I'm using a very simple application (OSG 2.0) with a very simple scene graph
made of a square with a white face and a black face, running at 75 fps with
vsync enabled on the graphics card; every 30 frames two instructions are
issued:

1 a rotation instruction causing a color change from black to white over the
whole screen

2 a character is sent over the serial port

I'm measuring with an oscilloscope the time elapsed between the start of the
electrical signal on the serial port and the pattern change on the VGA red
signal line caused by the black-to-white transition; the result is that it
takes two frames for a change to become visible on the screen (the measured
time is around 26 ms) when the OSG viewer is configured in single-threaded
mode.

Does anyone know why there are two frames of delay instead of one? (Please
note that we are not using triple buffering on the graphics card.)
_______________________________________________
osg-users mailing list
[email protected]
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/