Re: [osg-users] Texture missing when adding slaves dynamically to osgViewer

2009-10-09 Thread Drolet, Frederic
Hello,

I tried Texture::setUnRefImageDataAfterApply(false) and it works well.
However, from what I've read, the texture memory is now duplicated (once in
OpenGL and once in OSG). Isn't there a way to achieve the same thing in
OpenGL by sharing the contexts or something like that? As I said, I tried to
share a single context in the traits configuration, but it didn't work. For
now, our application doesn't use too much memory, but this could become a
problem once we start generating visual data from our database!
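
For reference, here is roughly how I now apply the flag across the whole
scene graph; a minimal sketch (KeepImagesVisitor is just my own helper, and
it only handles textures reachable through StateSets):

#include <osg/NodeVisitor>
#include <osg/Geode>
#include <osg/Texture>

// Asks every texture in the graph to keep its osg::Image after it has been
// applied, so that graphics contexts created later can still upload it.
class KeepImagesVisitor : public osg::NodeVisitor
{
public:
    KeepImagesVisitor() : osg::NodeVisitor(TRAVERSE_ALL_CHILDREN) {}

    virtual void apply(osg::Node& node)
    {
        process(node.getStateSet());
        traverse(node);
    }

    virtual void apply(osg::Geode& geode)
    {
        process(geode.getStateSet());
        for (unsigned int i = 0; i < geode.getNumDrawables(); ++i)
            process(geode.getDrawable(i)->getStateSet());
        traverse(geode);
    }

private:
    void process(osg::StateSet* stateset)
    {
        if (!stateset) return;
        for (unsigned int unit = 0;
             unit < stateset->getTextureAttributeList().size(); ++unit)
        {
            osg::StateAttribute* attr =
                stateset->getTextureAttribute(unit, osg::StateAttribute::TEXTURE);
            osg::Texture* texture = attr ? attr->asTexture() : 0;
            if (texture) texture->setUnRefImageDataAfterApply(false);
        }
    }
};

// Usage (sceneRoot is a placeholder for your scene root):
//   KeepImagesVisitor v; sceneRoot->accept(v);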

As for the osgUtil::Optimizer, we're not using it anywhere in our code... Is it 
called by the Viewer class during initialization or something?

Would there be another way to enable texture sharing for dynamically created 
rendering contexts while optimizing memory usage?

Thanks again for your help!

Frédéric Drolet




Re: [osg-users] Texture missing when adding slaves dynamically to osgViewer

2009-10-09 Thread Robert Osfield
Hi Frederic,

 I tried Texture::setUnRefImageDataAfterApply(false) and it works well.
 However, from what I've read, the texture memory is now duplicated (once in
 OpenGL and once in OSG). Isn't there a way to achieve the same thing in
 OpenGL by sharing the contexts or something like that? As I said, I tried
 to share a single context in the traits configuration, but it didn't work.
 For now, our application doesn't use too much memory, but this could become
 a problem once we start generating visual data from our database!

It's possible to share contexts in the OSG; I have no clue as to why it
hasn't worked in your case. There are just too many unknowns: you have
your code and I don't, so you're the only one really in a position to
debug it.

As for the general desirability of sharing GL objects between contexts:
yes, it can reduce memory usage, but it forces you to run the OSG
single-threaded, otherwise two contexts will contend for the same
resources, which are deliberately not mutex locked for performance
reasons. There is also only a limited set of cases where
drivers/hardware will actually share OpenGL contexts.
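
If you do go down that route, the two relevant pieces are the sharedContext
field of the Traits and the threading model; a rough sketch, where
masterContext and viewer stand in for your own objects:

#include <osgViewer/Viewer>

// Ask the new context to share its GL objects with an existing one.
osg::ref_ptr<osg::GraphicsContext::Traits> traits =
    new osg::GraphicsContext::Traits;
traits->sharedContext = masterContext;  // the context to share textures with
osg::ref_ptr<osg::GraphicsContext> gc =
    osg::GraphicsContext::createGraphicsContext(traits.get());

// Shared GL objects are not mutex locked, so keep everything in one thread.
viewer.setThreadingModel(osgViewer::ViewerBase::SingleThreaded);

Whether the driver actually honours the share request is another matter;
check gc.valid() and test on your real hardware.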

 As for the osgUtil::Optimizer, we're not using it anywhere in our code... Is 
 it called by the Viewer class during initialization or something?

The Viewer doesn't run the Optimizer.  Some plugins run it on their
own data though.

 Would there be another way to enable texture sharing for dynamically created 
 rendering contexts while optimizing memory usage?

That sounds a bit like a magic wand. OpenGL only allows you to share all
OpenGL objects or none; you don't get to share just some.

If you want to tightly manage the OpenGL memory footprint, then the new
texture + buffer object pool is what you'll want to use.
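
A minimal sketch of capping the pools, assuming a recent OSG development
release; the exact accessors may differ between versions, and the sizes
here are arbitrary:

#include <osg/DisplaySettings>

// Cap the GL memory the pools may use (sizes in bytes); textures and
// buffer objects are then recycled to stay under these limits.
osg::DisplaySettings* ds = osg::DisplaySettings::instance();
ds->setMaxTexturePoolSize(64u * 1024u * 1024u);
ds->setMaxBufferObjectPoolSize(32u * 1024u * 1024u);

Setting them before realizing the viewer is the safe option.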

Robert.


Re: [osg-users] Texture missing when adding slaves dynamically to osgViewer

2009-10-08 Thread Robert Osfield
Hi Frederic,

If you are creating new graphics contexts and applying an old scene
graph to them, then you can't use the
Texture::setUnRefImageDataAfterApply(true) feature of osg::Texture, as
this will discard the imagery once it has been applied to all the
graphics contexts the texture knows about.  By default the
osgUtil::Optimizer will switch this on to save memory, so try not
calling the Optimizer to see if it makes a difference.  It's possible
that the original database also has this option set, but for most
databases it'll be off, which is the default.
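
For reference, the flag in question is set per texture; a minimal
illustration (the file name is a placeholder):

#include <osg/Texture2D>
#include <osgDB/ReadFile>

osg::ref_ptr<osg::Image> image = osgDB::readImageFile("myimage.png");
osg::ref_ptr<osg::Texture2D> texture = new osg::Texture2D(image.get());

// When true, the osg::Image is released once the texture has been compiled
// in every graphics context the texture currently knows about; a context
// created after that point has no image data left to upload.
texture->setUnRefImageDataAfterApply(false);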

Robert.

On Thu, Oct 8, 2009 at 8:21 PM, Drolet, Frederic
frederic.dro...@drdc-rddc.gc.ca wrote:
 Hello,



 I’m having trouble with textures on slave cameras added to an osgViewer.
 Textures won’t appear if I add the slaves after a first call to
 osgViewer::frame().



 My application is composed of a rendering thread calling osgViewer::frame()
 every 15 ms (for a 60 Hz frame rate) and a main thread handling window and
 menu interactions (using MFC on Windows). One of those interactions is to
 add and remove slave cameras on the fly (adding a projection and view
 offset for multiple points of view). Here are the steps I follow to add a
 slave camera (a code sketch follows the list):



 · Pause my rendering thread calling osgViewer::frame() and wait for
 it to be idle;

 · Call osgViewer::stopThreading() to make sure the last frame is
 done drawing;

 · Create a child window with its own graphics context;

 · Add a slave to osgViewer using the newly created window handle
 (each slave camera uses its own osg::GraphicsContext object);

 · Call osgViewer::realize() to reinitialize the viewer and start
 threading again;

 · Unpause my rendering thread, which starts calling osgViewer::frame()
 again.
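
 In code, the add-slave step looks roughly like this (hwnd, width, height,
 projOffset and viewOffset are placeholders from my application; the two
 offsets are osg::Matrixd values):

 #include <osgViewer/Viewer>
 #include <osgViewer/api/Win32/GraphicsWindowWin32>

 // Wrap the MFC child window's HWND in a graphics context of its own.
 osg::ref_ptr<osg::GraphicsContext::Traits> traits =
     new osg::GraphicsContext::Traits;
 traits->x = 0; traits->y = 0;
 traits->width = width; traits->height = height;
 traits->doubleBuffer = true;
 traits->inheritedWindowData =
     new osgViewer::GraphicsWindowWin32::WindowData(hwnd);
 osg::ref_ptr<osg::GraphicsContext> gc =
     osg::GraphicsContext::createGraphicsContext(traits.get());

 // Attach a slave camera rendering into the new context.
 osg::ref_ptr<osg::Camera> camera = new osg::Camera;
 camera->setGraphicsContext(gc.get());
 camera->setViewport(new osg::Viewport(0, 0, traits->width, traits->height));

 viewer.stopThreading();
 viewer.addSlave(camera.get(), projOffset, viewOffset);
 viewer.realize();
 viewer.startThreading();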



 I use a similar approach to destroy slaves. Everything works fine except for
 the textures which are not displayed on the slave windows (but I can see the
 primitives).



 Note that if I add slaves before the first call to osgViewer::frame(),
 textures are ok. But removing and adding them again makes the textures
 disappear.



 I tried all the threading models in osgViewer, and I also tried to share the
 “master” context in the osg::GraphicsContext::Traits object of every slave.
 None of those solutions works. My understanding of OpenGL state sets is
 limited, so I’m probably missing something here.



 What am I doing wrong? Is adding slaves dynamically to an osgViewer even
 possible?



 Thanks for your help!



 Frederic Drolet, M. Sc.

 Computing Solutions and Experimentations | Solutions informatiques et
 expérimentations

 Systems of Systems | Systèmes de systèmes

 DRDC Valcartier | RDDC Valcartier

 2459, boul. Pie-XI North

 Quebec, Quebec

 G3J 1X5 CANADA

 Phone | Téléphone: (418) 844-4000 ext : 4820

 Fax | Télécopieur: (418) 844-4538

 E-mail | Courriel: frederic.dro...@drdc-rddc.gc.ca

 Web : www.valcartier.drdc-rddc.gc.ca
