Hi Andrew,

On Fri, Dec 24, 2010 at 3:52 PM, Andrew Cunningham wrote:
> When trying to fix this, I modified OSG and "chunked" up the calls to 
> glDrawElements inside PrimitiveSet.cpp to be  GL_MAX_ELEMENTS_INDICES in 
> size. Anything wrong with this approach?

It depends upon how you implemented it.  You certainly wouldn't want
to do a glGet for GL_MAX_ELEMENTS_INDICES on every frame, let alone
on every call that draws a single primitive set. It's also
questionable whether we should be doing extra size checks to catch
cases where users are passing data to the OSG that can't be rendered
with OpenGL.

The OSG isn't your nanny: it assumes that you know what scene graph
is required to render properly on your system, and that assumption
avoids burdening the OSG with extra checks that are only required for
scene graphs of inappropriate size.

The best place for any checking like this is a data preparation and
optimization pass, done when you create the data or load it into
memory.
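
As a minimal sketch of such a load-time pass, the index list of an
oversized primitive set could be split into ranges that each fit under
the queried limit, with each range then becoming its own DrawElements.
The helper below is purely illustrative (the name chunkIndexRanges and
its signature are not OSG API); the key detail is rounding the chunk
size down to a whole number of primitives so no triangle straddles two
draw calls:

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// Illustrative helper, not OSG API: split an index list of indexCount
// indices into (offset, count) ranges no larger than maxIndices,
// aligned to whole primitives (primSize == 3 for GL_TRIANGLES), so
// each range can back one glDrawElements call or one DrawElements set.
std::vector<std::pair<std::size_t, std::size_t>>
chunkIndexRanges(std::size_t indexCount,
                 std::size_t maxIndices,
                 std::size_t primSize)
{
    // Round down to a whole number of primitives per chunk.
    std::size_t chunk = (maxIndices / primSize) * primSize;
    std::vector<std::pair<std::size_t, std::size_t>> ranges;
    for (std::size_t offset = 0; offset < indexCount; offset += chunk)
    {
        std::size_t count = std::min(chunk, indexCount - offset);
        ranges.emplace_back(offset, count);
    }
    return ranges;
}
```

Doing this once at load time means GL_MAX_ELEMENTS_INDICES is queried
once, and the draw traversal itself stays free of per-frame checks.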

>This would mean the OSG user need not be concerned about whether they are 
>using 'too many' vertices/primitives.

But it would mean that everyone pays a performance cost for the
benefit of a tiny number of developers not building scene graphs
appropriate for their target hardware.

Robert.
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org