I know this is probably in the noise, but I was profiling my code and noticed 
many calls per frame to clock_gettime(). I traced this back to OSG (I'm using 
OSG 2.9.12, but it is probably true for other versions). 

I counted about 245 calls to clock_gettime() per frame with each 
clock_gettime() call taking around 0.2 microseconds. This means that around 49 
microseconds per frame are spent fetching the current time.

Looking at the code, most (if not all) of the calls come from 
osg::Timer::tick() and feed debugging information that is only reported 
at certain notify levels; however, the timing calculation itself is 
performed no matter what the notify level is.

Again, I know this might be in the noise, but for my project, I need to 
conserve every CPU cycle I can.


Paul P.

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=39545#39545





_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
