Hi,
 
I am experiencing VERY bad performance (i.e. 1 FPS) when I enable precipitation 
on my Dell laptop (XPS 1710), which uses a GeForce 7950 graphics card (Dell 
driver version 179.xx).

But here's the kicker... the performance is fine when running on any other 
desktop system that has a more recent graphics card (for example, an 8800 GTX). 
I can understand that those would run faster, but I never expected a drop to 
1 FPS on a 7950, especially when there is ABSOLUTELY nothing in the scene. 

I've reduced my application to the bare minimum to isolate the problem. All I 
have in my scene graph is a pre-render camera and a post-render camera used to 
render my scene to a texture (using the render-target-implementation functions 
of osg::Camera), and an instance of the precipitation effect. 
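
In case it helps, here is roughly what the reduced test looks like. This is 
only a sketch: the texture size, the rain intensity, and the fact that I leave 
out the post-camera that displays the texture are simplifications on my part, 
not the exact code.

#include <osg/Camera>
#include <osg/Group>
#include <osg/Texture2D>
#include <osgParticle/PrecipitationEffect>
#include <osgViewer/Viewer>

int main()
{
    // Texture that the pre-render camera renders into.
    osg::ref_ptr<osg::Texture2D> rttTexture = new osg::Texture2D;
    rttTexture->setTextureSize(1024, 1024);          // size is an assumption
    rttTexture->setInternalFormat(GL_RGBA);

    // Pre-render camera: renders the (empty) scene to the texture.
    osg::ref_ptr<osg::Camera> preCamera = new osg::Camera;
    preCamera->setRenderOrder(osg::Camera::PRE_RENDER);
    preCamera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    preCamera->setViewport(0, 0, 1024, 1024);
    preCamera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    preCamera->attach(osg::Camera::COLOR_BUFFER, rttTexture.get());

    // Precipitation effect added alongside the RTT camera.
    osg::ref_ptr<osgParticle::PrecipitationEffect> rain =
        new osgParticle::PrecipitationEffect;
    rain->rain(0.5f);                                // intensity is arbitrary

    osg::ref_ptr<osg::Group> root = new osg::Group;
    root->addChild(preCamera.get());
    root->addChild(rain.get());

    osgViewer::Viewer viewer;
    viewer.setSceneData(root.get());
    return viewer.run();
}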

I tried using different render-target-implementation techniques (like 
FRAME_BUFFER), but that did not change anything. 

If I bypass the render-target implementation (i.e. I skip the 
osg::Camera::attach() call), keep the pre-camera, and remove the post-camera, 
then everything works fine (running at 60 FPS). 
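
For contrast, this is roughly the variant that runs smoothly, again only a 
sketch reusing the names (rttTexture, rain, root) from the block above:

    // Same pre-render camera, but attach() is bypassed and the post-camera
    // is removed, so no render-target implementation is involved.
    osg::ref_ptr<osg::Camera> preCamera = new osg::Camera;
    preCamera->setRenderOrder(osg::Camera::PRE_RENDER);
    preCamera->setViewport(0, 0, 1024, 1024);
    // preCamera->attach(osg::Camera::COLOR_BUFFER, rttTexture.get()); // bypassed

    root->addChild(preCamera.get());
    root->addChild(rain.get());   // precipitation still present, 60 FPS here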

Can someone shed some light on why the render-target implementation would 
affect osgParticle::PrecipitationEffect (or vice versa)?

Note: the problem is not the precipitation effect itself, because I compared my 
implementation with the OSG precipitation example and they are the same. When I 
run the example, the frame rate is 60 FPS (which is the same as my app when I 
bypass the render-to-texture step). 

Cheers,
Guy
