Hi all,
In my application I set up an RTT camera with an osg::Texture2D attached whose
size is 4096x4096, and I then call tex2d->getTextureObject(contextID) to obtain
the underlying texture object. The result is that memory keeps increasing by
about 60 MB per second. It is an MFC program. Is there anything that needs
special attention when using an RTT camera, or can anyone give me a clue? Is it
possible that I unknowingly changed some global setting that causes this behavior?
By the way, in another console application with an RTT camera created in the
same way, memory does not increase.
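
To show what I mean, here is roughly how the texture is created and how I query
the texture object. This is only a simplified sketch: the internal format, the
filter modes, and the way contextID is obtained are placeholders, and my actual
MFC code differs in those details.

#include <osg/Texture2D>
#include <osgViewer/Viewer>

// Create the render-target texture (placeholder format/filter settings).
osg::ref_ptr<osg::Texture2D> createHeightMap(int texturesize)
{
    osg::ref_ptr<osg::Texture2D> heightMap = new osg::Texture2D;
    heightMap->setTextureSize(texturesize, texturesize);
    heightMap->setInternalFormat(GL_RGBA32F_ARB);
    heightMap->setFilter(osg::Texture::MIN_FILTER, osg::Texture::LINEAR);
    heightMap->setFilter(osg::Texture::MAG_FILTER, osg::Texture::LINEAR);
    return heightMap;
}

// Query the GL texture object for a given context; only valid after the
// RTT pass has rendered at least once in that context.
GLuint queryTextureId(osg::Texture2D* tex2d, osgViewer::Viewer* viewer)
{
    unsigned int contextID =
        viewer->getCamera()->getGraphicsContext()->getState()->getContextID();
    osg::Texture::TextureObject* to = tex2d->getTextureObject(contextID);
    return to ? to->id() : 0;
}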
The RTT camera creation code is as follows:
osg::ref_ptr<osg::Camera> camera = new osg::Camera;
camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF_INHERIT_VIEWPORT);
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
camera->setClearColor(osg::Vec4(-1000.0f, -1000.0f, -1000.0f, 1.0f));
camera->setViewport(0, 0, texturesize, texturesize);
camera->setRenderOrder(osg::Camera::PRE_RENDER);
camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
camera->attach(osg::Camera::COLOR_BUFFER, heightMap);   // render the colour output into the texture
camera->setAllowEventFocus(false);
camera->setCullingActive(false);
camera->setImplicitBufferAttachmentMask(0, 0);           // no implicit depth/stencil attachments
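
For completeness, this is roughly how the camera goes into the scene graph;
'root' and 'heightSubgraph' are placeholders for the actual nodes in my scene:

camera->addChild(heightSubgraph.get());   // geometry rendered into heightMap
root->addChild(camera.get());             // PRE_RENDER, so it runs before the main camera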
Thank you.
------------------
Failure is the mother of success.
Wu Zhicheng