Hello guys,

I am working on a large project that uses OSG 3.0.0, and I have run into a very 
strange problem. During application initialization I create a temporary 
graphics context in order to perform a series of actions (checking OpenGL 
extension support, creating 3D textures, etc.). When I am done I call 
osg::GraphicsContext::releaseContext() to release the temporary context. Then I 
create my scene graph, create my viewer, set my scene graph as the viewer's 
scene data, and start calling the Viewer::frame() function myself (I do not use 
Viewer::run() because it is a Qt application).
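
For completeness, frame() is driven from a Qt timer, roughly like the sketch 
below. The widget and variable names are mine and just illustrative, not OSG 
or Qt API beyond QTimer itself:

```cpp
// Sketch of the Qt-driven render loop (my setup, simplified):
// a QTimer periodically schedules a repaint, and the paint handler
// calls viewer->frame().
QTimer* frameTimer = new QTimer(glWidget);
QObject::connect(frameTimer, SIGNAL(timeout()), glWidget, SLOT(updateGL()));
frameTimer->start(16);  // roughly 60 frames per second

// ... and in the widget's paintGL():
//     _viewer->frame();
```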

Let's focus on the problem now. After all the initialization, while the 
application runs, I check the memory usage in the Windows Task Manager. If I do 
not create this temporary graphics context during initialization, the 
application uses about 100MB of memory. If I do create the temporary graphics 
context, memory usage is almost 300MB!
I had a closer look at the memory usage during application initialization:
* Without the temporary graphics context, memory usage climbs to 270MB, then 
suddenly drops to 90MB, and finally settles at about 100MB. (I guess this is 
because some vertex and texture data is released from CPU memory after being 
transferred to GPU memory.)
* With the temporary graphics context, the 270MB -> 90MB drop never happens. 
During initialization memory climbs to 300MB and stays there.

Then I realized that this does not happen if I create the temporary graphics 
context AFTER calling the Viewer::frame() function once (i.e. AFTER my main 
graphics context has been created). Why is this happening? Are there some 
"rules" about when a temporary graphics context may be created?

Here is my code for the temporary graphics context:


Code:

class TemporaryGraphicsContext
{
public:
    TemporaryGraphicsContext(void) : _id(-1)
    {
        // A minimal 1x1, single-buffered, windowless set of traits.
        osg::ref_ptr<osg::GraphicsContext::Traits> traits =
            new osg::GraphicsContext::Traits;
        traits->x = traits->y = 0;
        traits->width = traits->height = 1;
        traits->windowDecoration = false;
        traits->doubleBuffer = false;
        traits->sharedContext = 0;
        traits->pbuffer = false;

        _gc = osg::GraphicsContext::createGraphicsContext(traits.get());
        if (_gc.valid())
        {
            _gc->realize();
            _gc->makeCurrent();
            _id = _gc->createNewContextID();
        }
    }

    ~TemporaryGraphicsContext()
    {
        // Guard against a context that was never created successfully.
        if (_gc.valid())
            _gc->releaseContext();
    }

    bool valid() const { return _gc.valid() && _gc->isRealized(); }
    int getID(void) const { return _id; }

public:
    osg::ref_ptr<osg::GraphicsContext> _gc;
    int _id;
};



And here is how I use it:

Code:

{
    TemporaryGraphicsContext tmpGC;
    //  Do stuff
}
//  tmpGC goes out of scope and is released automatically (see the destructor)
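
And here is a rough sketch of the ordering that avoids the problem for me. The 
point is only the order of operations; the rest of my viewer setup is omitted, 
and I am assuming the main graphics context is created by the first frame() 
call:

```cpp
// 'viewer' is my osgViewer::Viewer, already given its scene data.
viewer.realize();   // main graphics context gets created
viewer.frame();     // first frame rendered

{
    // Creating the temporary context only AFTER the main context exists
    // avoids the extra ~200MB of resident memory described above.
    TemporaryGraphicsContext tmpGC;
    if (tmpGC.valid())
    {
        // extension checks, 3D texture creation, etc.
    }
}   // released automatically in the destructor
```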




Any ideas what is going on? Thank you for your time guys!

Cheers,
George

------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=43064#43064




