Hi Werner,

The values are averaged, as otherwise they'd jump around so much that they would be an unreadable blur.
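If you want to look at the raw, per-frame numbers behind the averaged on-screen display, the snippet below is a minimal, untested sketch of reading them directly from the stats attached to the viewer's camera. The collectStats() keys ("rendering", "gpu"), the attribute name "GPU draw time taken" and the model file are assumptions based on what I recall the stats handler and renderer using, so check them against your OSG version:

    // Minimal sketch: pull raw per-frame GPU timings straight out of the
    // osg::Stats attached to the viewer's camera, instead of reading the
    // smoothed numbers the StatsHandler draws on screen.
    #include <osg/Stats>
    #include <osgDB/ReadFile>
    #include <osgViewer/Viewer>
    #include <iostream>

    int main(int, char**)
    {
        osgViewer::Viewer viewer;
        viewer.setSceneData(osgDB::readNodeFile("cow.osgt")); // placeholder model
        viewer.realize();

        osg::Camera* camera = viewer.getCamera();
        if (!camera->getStats()) camera->setStats(new osg::Stats("Camera"));
        camera->getStats()->collectStats("rendering", true);
        camera->getStats()->collectStats("gpu", true); // GPU timing needs timer-query support

        while (!viewer.done())
        {
            viewer.frame();

            // GPU results lag a couple of frames behind, so query an older frame.
            unsigned int frame = camera->getStats()->getLatestFrameNumber();
            if (frame < 3) continue;

            double gpuTime = 0.0;
            if (camera->getStats()->getAttribute(frame - 2, "GPU draw time taken", gpuTime))
            {
                std::cout << "frame " << (frame - 2)
                          << " GPU draw time: " << gpuTime * 1000.0 << " ms\n";
            }
        }
        return 0;
    }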
The exact values are all collated in the various osg::Stats objects, so perhaps you can have a look at these.

As a general note, OpenGL timing data is not deterministic; this is partly down to the OpenGL FIFO blocking at different points in the frame, or sometimes the OS and the driver interacting in different ways each frame. This effect gets worse the more overloaded your system is. This means you need to take the stats as a hint to what is going on, but they are not hard and fast; to really understand them you need to be aware of how the various things interact.

Robert.

On Tue, 17 Jul 2018 at 10:43, Werner Modenbach <werner.modenb...@texion.eu> wrote:
>
> Hi all,
>
> I'm trying to optimize the display speed of my application by testing
> various techniques.
>
> In order to do exact measurements I implemented a mechanism where I can
> trigger single frame() calls from my keyboard.
>
> Curiously the GPU times shown by the stats handler are always drifting
> somehow over significant ranges before getting stable. Does anybody know
> if there is some averaging over a couple of frames, or is there any other
> effect I don't see at the moment? Or is this a kind of caching effect?
>
> It's difficult to judge the effects of techniques under those
> circumstances if I don't have an idea what is causing the drifts.
>
> Thanks for any hints.
>
> - Werner -
>
>
> _______________________________________________
> osg-users mailing list
> osg-users@lists.openscenegraph.org
> http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org