Hi all

New to OSG, so apologies if this is a stupid question.

I have written a visualization tool in OSG and, as a last step, I wish to ensure that it runs at the screen refresh rate. My perception was that it mostly runs smoothly but stutters every few frames, so I wrote code to print the elapsed time between consecutive frames, rather than just the average framerate. The printout seemed to confirm the stutter, so I gradually removed more and more of my scenegraph, trying to determine where the bottleneck lies. The weird thing is that I end up with a completely empty scenegraph (just rendering a blue screen) and the frame times STILL seem to stutter. So, I strongly suspect there is something fundamentally wrong with the way I am measuring. I realise that FPS is a bad performance metric, but right now I just want to get to the point where I update every frame on my own hardware; I am not particularly concerned about performance on any other hardware. The remaining code looks like this:

#include <iostream>
#include <osgViewer/CompositeViewer>
int main() {
    osg::ref_ptr<osgViewer::CompositeViewer> viewer = new osgViewer::CompositeViewer();
    osg::ref_ptr<osgViewer::View> view1 = new osgViewer::View;
    view1->setUpViewInWindow(50, 50, 640, 480);   // open an on-screen window for the view
    viewer->addView(view1);
    double timeNow = 0.0, timePrev = 0.0;
    while (!viewer->done()) {
        viewer->frame();                          // event, update and rendering traversals
        timeNow = viewer->elapsedTime();          // seconds since the viewer's start tick
        std::cout << timeNow - timePrev << " "; timePrev = timeNow;
    }
}

and the resulting printout to the console looks like this:

0.123003 0.00298501 0.000751987 0.016045 0.083418 0.038128 0.013075
0.00068298 0.014678 0.019825 0.013804 0.016688 0.01651 0.066802 0.023197
0.07995 0.013019 0.000387967 0.0331 0.000599027 0.03273 0.000575006
0.088067 0.083023...

As you can see, the loop occasionally takes as long as 88 ms to complete, while at other times it completes within 0.6 ms. What could possibly be causing this massive variation?
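
The same measurement can also be written with osg::Timer so that only the frame() call itself is timed, rather than the whole loop iteration. This is just a minimal sketch, assuming the same viewer setup as above:

    #include <osg/Timer>   // in addition to the includes above

    osg::Timer* timer = osg::Timer::instance();
    while (!viewer->done())
    {
        osg::Timer_t start = timer->tick();                          // timestamp just before the frame
        viewer->frame();                                             // event, update and rendering traversals
        std::cout << timer->delta_s(start, timer->tick()) << " ";    // seconds spent inside frame()
    }

That would at least show whether the time is actually being spent inside frame() or somewhere else in the loop (for example in the console output itself).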

Hannes Naude