Hi Endre,
On Thu, Apr 2, 2009 at 12:56 PM, Endre Lidal <[email protected]> wrote:
> Yes, that is correct. What I did was to run my application unchanged on
> the multi-GPU computer, expanding the application window by dragging its
> border across all 4 displays. Everything except the textures that are
> affected by my subloading is working. For instance, my "skybox" is rendered
> correctly on all 4 displays. By the way, it is a wxWidgets application, if
> that matters.
If you are using a single window then you'll have a single graphics context,
but the OS/OpenGL driver will be cloning it behind the scenes to maintain the
illusion that you have a single context across both GPUs. While this might
seem convenient, it is bad for performance.
The right way to drive multiple graphics cards is to open one graphics
context per card, and use the OSG's master/slave camera support in osgViewer
to do all the threading/context management. I've heard from Windows users
that Windows might still be cloning contexts though.
Personally I'd recommend using Unix for multi-context work, as X11 provides
much better control over assigning specific contexts to specific GPUs.
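A minimal sketch of the one-context-per-GPU setup, along the lines of the osgcamera example. This is an assumption-laden illustration, not a drop-in solution: the screen count, window dimensions, model file, and the projection offsets that tile the screens horizontally are all placeholders you'd adjust for your X11 configuration.

```cpp
// Sketch only: one graphics context per GPU, each driven by a slave camera.
// Assumes two X screens (one per GPU); screenNum/width/height are placeholders.
#include <osgViewer/Viewer>
#include <osgDB/ReadFile>

int main(int, char**)
{
    osgViewer::Viewer viewer;
    viewer.setSceneData(osgDB::readNodeFile("cow.osg"));  // placeholder model

    const unsigned int numScreens = 2;
    for (unsigned int i = 0; i < numScreens; ++i)
    {
        osg::ref_ptr<osg::GraphicsContext::Traits> traits =
            new osg::GraphicsContext::Traits;
        traits->screenNum = i;          // one context per screen/GPU
        traits->x = 0;
        traits->y = 0;
        traits->width = 1280;
        traits->height = 1024;
        traits->windowDecoration = false;
        traits->doubleBuffer = true;

        osg::ref_ptr<osg::GraphicsContext> gc =
            osg::GraphicsContext::createGraphicsContext(traits.get());

        osg::ref_ptr<osg::Camera> camera = new osg::Camera;
        camera->setGraphicsContext(gc.get());
        camera->setViewport(
            new osg::Viewport(0, 0, traits->width, traits->height));

        // Offset each slave's projection so the displays tile side by side.
        viewer.addSlave(camera.get(),
                        osg::Matrix::translate(double(numScreens - 1) - 2.0 * i,
                                               0.0, 0.0),
                        osg::Matrix());
    }

    return viewer.run();
}
```

With this arrangement osgViewer gives each context its own State (and hence its own contextID), and can run one graphics thread per context.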
> I was thinking about the graphics contexts too, but I added a printout to
> my sub-load callback, something like this:
>
>
> Code:
> ...
> void
> MySubloadCallback::subload(const osg::TextureRectangle& texture,
>                            osg::State& state) const
> {
>     const osg::Image* img = texture.getImage();
>     const unsigned int contextID = state.getContextID();
>     printf("ContextID = %u\n", contextID);
> ...
>
>
>
> But I only get "ContextID = 0" printed. This makes me suspect that I have
> only one graphics context. Am I right? This is where I cannot find any
> documentation (or examples) on where to look. Sorry if I'm a bad searcher.
> Is there a guide for how to set up multiple graphics contexts (if I need
> them) across multiple GPUs? I used to work on SGI computers and multi-pipe
> rendering in the old days, and maybe I'm not understanding how multi-GPU
> rendering nowadays differs from that. Is the driver, for instance, handling
> all this multi-GPU stuff so that the OSG only sees one large screen?
>
The driver is playing games, hiding the fact that there are really two
graphics contexts doing the work while exposing only one at the application
level. It sounds like the driver is mostly pulling this off successfully but
fails with your subloading. So it's basically a driver bug, and there won't
be anything you can do about it on the OSG end with your current
viewer/graphics context configuration.
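To confirm how many contexts the viewer really has, you can ask it directly rather than inferring from the subload callback. A small sketch (printContexts is a hypothetical helper name):

```cpp
// Sketch: list the graphics contexts a viewer knows about and their contextIDs.
#include <osgViewer/Viewer>
#include <cstdio>

void printContexts(osgViewer::Viewer& viewer)
{
    osgViewer::Viewer::Contexts contexts;
    viewer.getContexts(contexts);
    printf("Number of graphics contexts: %u\n",
           static_cast<unsigned int>(contexts.size()));
    for (unsigned int i = 0; i < contexts.size(); ++i)
    {
        // Each context carries its own osg::State with a unique contextID.
        printf("  context %u has contextID %u\n",
               i, contexts[i]->getState()->getContextID());
    }
}
```

With a single window, and hence a single context, this will report one context with contextID 0, which matches what your subload callback prints.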
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org