Hi Boon Wah,

On 4 September 2012 02:11, Boon Wah <[email protected]> wrote:
>     Just a final question here, is there any way currently to parse a node 
> within GPU itself and store all the node data directly on GPU. Will this be 
> planned for future OSG releases?

The OSG does a pre-compile traversal of the whole scene graph during
the first frame, but this only applies to the scene graph assigned at
that point.  You can force another compile traversal by calling
osgViewer::Renderer::setCompileOnNextDraw(true); the Renderer is
assigned to the Camera in the viewer.
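As a rough sketch of the above (the viewer and subgraph names are illustrative, not from your code), something like this would force a compile traversal for a newly added subgraph:

```cpp
#include <osgViewer/Viewer>
#include <osgViewer/Renderer>

// Sketch: attach a new subgraph and request that its GL objects be
// compiled on the next draw, rather than lazily during rendering.
void addAndCompile(osgViewer::Viewer& viewer, osg::Node* newNode)
{
    // Attach the new subgraph; its textures/VBOs haven't been compiled yet.
    viewer.getSceneData()->asGroup()->addChild(newNode);

    // The Renderer is the GraphicsOperation assigned to the viewer's Camera.
    osgViewer::Renderer* renderer =
        dynamic_cast<osgViewer::Renderer*>(viewer.getCamera()->getRenderer());
    if (renderer)
    {
        // Force a compile traversal on the next draw so the new data is
        // downloaded to the GPU up front.
        renderer->setCompileOnNextDraw(true);
    }
}
```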

>     GPU memory sizes are getting increasingly large, and it will be 
> beneficial if we could do all these on GPU itself, rather than using CPU 
> memory space.

The OSG does all rendering on the GPU and stores data on the GPU when
instructed to by scene graph usage, i.e. texture objects, vertex buffer
objects etc.  The OSG also has a feature in osg::Texture that enables the
automatic release of the osg::Image once the image data has been
downloaded to OpenGL.
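A minimal sketch of that Texture feature (the helper function and filename handling are my own illustration):

```cpp
#include <osg/Texture2D>
#include <osgDB/ReadFile>
#include <string>

// Sketch: create a texture that drops its CPU-side osg::Image once the
// image data has been downloaded to OpenGL, leaving only the GPU copy.
osg::Texture2D* createTexture(const std::string& filename)
{
    osg::Texture2D* texture = new osg::Texture2D;
    texture->setImage(osgDB::readImageFile(filename));

    // After the first apply (download to the GPU) the osg::Image is
    // unreferenced, freeing the main-memory copy.
    texture->setUnRefImageDataAfterApply(true);
    return texture;
}
```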

There are limits on just how much you can sensibly do on the GPU, and
you can't transfer data directly from disk to the GPU, so you always
have to go through main memory; it's really just a question of how
you manage that memory.

The OSG is a professional-grade scene graph and already does what the
majority of the vis-sim industry needs from a scene graph, so I'd
recommend you don't get too far ahead of yourself with grand plans of
what might be.  I'd recommend you just take the time to learn what the
OSG can do and how it does it; the new OSG books will help you in this
endeavour.

Robert.
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org