Hi Robert,

I noticed one missing feature in OSG and I would be glad to discuss the 
solution with you before making further steps.

The problem:

- In the cull phase, it would be useful to be able to find out which OpenGL 
extensions are supported and which GLSL version is available in the graphics 
context. Why might anybody need this? The programmer may want to cull different 
scene graphs for contexts with different capabilities. For example, if the 
target platform supports two-sided stencil, a faster and simpler scene graph 
can be culled, while another scene graph is provided when only single-sided 
stenciling is available. The same goes for geometry shaders: they may be 
supported through the GL_ARB_geometry_shader4 extension, requiring its own 
shader code; through OpenGL 3.2 with different shader code; or the programmer 
might provide yet another graph for OpenGL 4.0 geometry shaders, profiting 
from the multiple-invocations capability introduced in GLSL 4.0. However, to 
make this work, we need access to the graphics context capabilities in the 
cull phase.
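The selection logic could look something like the following plain C++ sketch. 
None of these are real OSG types; ContextCaps and the chooser functions are 
hypothetical, just illustrating the kind of decision the cull phase would make:

```cpp
#include <string>

// Hypothetical per-context capability record (not a real OSG type).
struct ContextCaps {
    bool twoSidedStencil;   // e.g. via GL_EXT_stencil_two_side or GL 2.0
    int  glslVersion;       // GLSL version * 100, e.g. 400 for GLSL 4.0
};

// Pick the stencil scene graph variant best suited to the context.
std::string chooseStencilGraph(const ContextCaps& caps)
{
    return caps.twoSidedStencil ? "two-sided-stencil-graph"
                                : "single-sided-stencil-graph";
}

// Pick the geometry shader variant: GLSL 4.0 multiple invocations,
// OpenGL 3.2 core geometry shaders, or the ARB extension fallback.
std::string chooseGeometryShaderGraph(const ContextCaps& caps)
{
    if (caps.glslVersion >= 400) return "gs-multi-invocation-graph";
    if (caps.glslVersion >= 150) return "gs-core-graph";
    return "gs-arb-extension-graph";  // GL_ARB_geometry_shader4 path
}
```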

Proposed solutions (for you to judge and possibly suggest the best approach):

- To avoid extensive changes to OSG, I think we can use the existing approach 
of the "Extensions" nested class, like Drawable::Extensions, 
Stencil::Extensions, etc. The advantage of this approach is that the user may 
detect whatever he wants inside Extensions::setupGLExtensions(), and users are 
already familiar with the concept. The problem with the Extensions class is 
that it is not initialized until the first getExtensions() call, which happens 
in the draw phase. Usually this means we cannot get a valid Extensions 
instance during the first frame's cull phase, and it is still not guaranteed 
to be initialized by the second frame.
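To illustrate why the cull phase comes up empty, here is a minimal sketch of 
the lazy-initialization scheme (simplified stand-in types, not the actual OSG 
code): the Extensions instance for a context is only created on the first 
getExtensions() call, which normally happens during draw:

```cpp
#include <map>
#include <memory>

// Simplified stand-in for a *::Extensions class.
struct Extensions {
    bool setupDone = false;
    void setupGLExtensions(unsigned int /*contextID*/) { setupDone = true; }
};

// One Extensions instance per graphics context ID.
std::map<unsigned int, std::unique_ptr<Extensions>>& extensionsPerContext()
{
    static std::map<unsigned int, std::unique_ptr<Extensions>> table;
    return table;
}

// Lazy accessor: creates and initializes the instance on first use.
// A cull-phase query (createIfNotInitialized == false) before the first
// draw simply gets a null pointer back.
Extensions* getExtensions(unsigned int contextID, bool createIfNotInitialized)
{
    auto& table = extensionsPerContext();
    auto it = table.find(contextID);
    if (it != table.end()) return it->second.get();
    if (!createIfNotInitialized) return nullptr;
    auto ext = std::make_unique<Extensions>();
    ext->setupGLExtensions(contextID);
    return (table[contextID] = std::move(ext)).get();
}
```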

- My idea, and I ask your counsel here, was to auto-initialize all 
*::Extensions instances for any new graphics context that is realized in the 
application. If this is overkill for some reason, we might provide a 
per-graphics-context flag so the user can choose which contexts should use 
this functionality and which should not.

- To implement such functionality, we might use a proxy approach and register 
all *::Extensions classes in a global list. Then we could, for instance, go 
through the list and call setupGLExtensions() for each registered 
*::Extensions class. This procedure would be performed whenever a context is 
realized.
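A plain C++ sketch of the proxy idea (again no real OSG types): each 
*::Extensions class registers an init function in a global list via a static 
proxy object, and realizing a context walks the list:

```cpp
#include <vector>

using SetupFunc = void (*)(unsigned int contextID);

// Global registry of setupGLExtensions-style functions.
std::vector<SetupFunc>& extensionRegistry()
{
    static std::vector<SetupFunc> registry;
    return registry;
}

// Proxy object: one static instance per Extensions class registers
// its setup function before main() runs.
struct ExtensionsProxy {
    explicit ExtensionsProxy(SetupFunc f) { extensionRegistry().push_back(f); }
};

// Example class standing in for e.g. Stencil::Extensions.
struct StencilExtensions {
    static bool initialized[8];  // one flag per context ID (fixed size here)
    static void setup(unsigned int contextID) { initialized[contextID] = true; }
};
bool StencilExtensions::initialized[8] = {};

static ExtensionsProxy stencilProxy(&StencilExtensions::setup);

// Called whenever a graphics context is realized: initializes every
// registered Extensions class for that context.
void onContextRealized(unsigned int contextID)
{
    for (SetupFunc f : extensionRegistry()) f(contextID);
}
```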

- Another approach would be to give the user the possibility to register a 
GraphicsContext::realizeCallback (something like swapCallback, but for 
realize). This way, the user may initialize the required *::Extensions classes 
himself. The disadvantage of this approach is that the user would have to 
write some code (the initialization code and the callback registration), 
whereas the proxy approach mentioned above would do the job for him 
automatically.
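A sketch of this user-facing alternative, with GraphicsContext as a stand-in 
class (the proposed realizeCallback does not exist in OSG yet; this only 
shows the intended shape of the API):

```cpp
#include <functional>

// Stand-in for osg::GraphicsContext, extended with the proposed callback.
struct GraphicsContext {
    unsigned int contextID = 0;
    std::function<void(GraphicsContext&)> realizeCallback;

    void realize()
    {
        // ... platform window/context creation would happen here ...
        if (realizeCallback) realizeCallback(*this);
    }
};
```

The user would then register a callback that initializes exactly the 
Extensions classes his application needs, e.g. 
gc.realizeCallback = [](GraphicsContext& c) { /* init Extensions for c.contextID */ };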

Before discussing implementation details, do you think the proposal is 
reasonable, or would you prefer a different approach to the problem? 
Essentially, it is about the ability to know the graphics context capabilities 
during the cull phase, so that the scene graph best suited to the graphics 
context can be culled.

Thanks for the answer and keeping OSG growing.
John
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org