Hello All.

I don't normally like posting this kind of plea for help, but I'm completely
stuck.

I am using OSG 1.2 to do HDR RTT. I have a CameraNode (PRE_RENDER) with an
RGB32F texture attached. Under this I have something simple - i.e. the OSG
shiny cow. Alongside, I also have a NESTED_RENDER absolute-projection
camera with a full-screen quad carrying the RTT texture. The quad has no
tone-mapping shader at all. So far, pretty normal. Performance is fine and
stable.
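
To be concrete, the setup is roughly along these lines (a simplified sketch,
not the actual application code - the texture size and names like cowNode and
fullScreenQuad are just placeholders):

#include <osg/Texture2D>
#include <osg/CameraNode>

// Float colour buffer for the HDR render target.
osg::ref_ptr<osg::Texture2D> hdrTex = new osg::Texture2D;
hdrTex->setTextureSize(1024, 1024);
hdrTex->setInternalFormat(GL_RGB32F_ARB);   // 0x8815, from ARB_texture_float
hdrTex->setSourceFormat(GL_RGB);
hdrTex->setSourceType(GL_FLOAT);

// PRE_RENDER camera that draws the cow into the float texture.
osg::ref_ptr<osg::CameraNode> rttCam = new osg::CameraNode;
rttCam->setRenderOrder(osg::CameraNode::PRE_RENDER);
rttCam->setRenderTargetImplementation(osg::CameraNode::FRAME_BUFFER_OBJECT);
rttCam->setViewport(0, 0, 1024, 1024);
rttCam->attach(osg::CameraNode::COLOR_BUFFER, hdrTex.get());
rttCam->addChild(cowNode.get());            // placeholder for the scene

// NESTED_RENDER camera with an absolute projection drawing the quad.
osg::ref_ptr<osg::CameraNode> quadCam = new osg::CameraNode;
quadCam->setRenderOrder(osg::CameraNode::NESTED_RENDER);
quadCam->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
quadCam->setProjectionMatrixAsOrtho2D(0.0, 1.0, 0.0, 1.0);
quadCam->setViewMatrix(osg::Matrix::identity());
quadCam->addChild(fullScreenQuad.get());    // quad with hdrTex bound, no shader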

However, if I add fog to the cow's stateset, performance gets much, much
worse; in fact, as you zoom in on the cow (i.e. as the cow occupies more of
the texture), the frame rate drops lower and lower until it's effectively
stopped. If I remove the RTT bits from the scene graph - i.e. detach the
RGB32F texture and don't draw the quad - everything is fine. If the RTT
texture is RGB8, everything is fine.
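
By "add fog" I mean nothing more exotic than a standard osg::Fog attribute on
the stateset, roughly like this (a sketch - the mode and parameter values here
are illustrative, not the real ones):

#include <osg/Fog>
#include <osg/StateSet>

osg::ref_ptr<osg::Fog> fog = new osg::Fog;
fog->setMode(osg::Fog::EXP2);                       // exponential fog
fog->setDensity(0.01f);
fog->setColor(osg::Vec4(0.7f, 0.7f, 0.7f, 1.0f));
cowNode->getOrCreateStateSet()->setAttributeAndModes(
    fog.get(), osg::StateAttribute::ON);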

This is all buried in a complex application; I haven't yet been able to
reproduce this behaviour with one of the examples, but I was hoping
somebody might be able to tell me where to start looking. My guesses, in
rough order of likelihood: a driver issue; application memory management
(e.g. not using ref_ptrs where I should have been, not clearing up, array
bounds, that sort of thing); or the lack of a tone-mapping shader doing
something funny on-card...

I don't think it's worth me explaining any more, as it's so app-dependent,
and consequently I'd quite understand if I got no replies at all. However,
if anybody has seen this before, has any ideas of where to start looking,
or has any useful diagnostic tricks, I'd be grateful.

Thanks,

David