Hello to all!
I have a simple question.
I want to render two cameras (two different render surfaces) with exactly
the same scene, but the two cameras have different sizes.
I.e. the two cameras have to show me the same picture, but not at the same
size.
1. I'm using a CameraNode and a frame buffer object in order to render the
scene to a texture.
2***. I use the same texture for both cameras.
3. If I attach the render target as an image, then everything is OK.
    But if I attach the render target as a texture, then only one camera
shows me the texture.
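
For reference, here is roughly how I set up the pre-render camera (a
minimal sketch, assuming OSG 2.x naming, where CameraNode became
osg::Camera; the function name, the 1024x1024 size and the scene/texture
parameters are just illustrative):

#include <osg/Camera>
#include <osg/Texture2D>

// Minimal sketch of the pre-render (RTT) camera. 'scene' is the
// subgraph being rendered into 'texture'.
osg::ref_ptr<osg::Camera> makeRttCamera(osg::Node* scene,
                                        osg::Texture2D* texture)
{
    osg::ref_ptr<osg::Camera> camera = new osg::Camera;

    // Render into a frame buffer object before the main cameras draw.
    camera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    camera->setRenderOrder(osg::Camera::PRE_RENDER);
    camera->setViewport(0, 0, 1024, 1024);

    // Attaching the texture keeps the result on the GPU; this is the
    // case where only one of my cameras shows the texture.
    camera->attach(osg::Camera::COLOR_BUFFER, texture);

    // Attaching an osg::Image instead reads the result back to the
    // CPU, and then both cameras work:
    //   camera->attach(osg::Camera::COLOR_BUFFER, image);

    camera->addChild(scene);
    return camera;
}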

Is there any way to use the same texture, rendered into a frame buffer
object, for the two cameras without copying it?
Or maybe there is another way to achieve this?

Thanks a lot!

*** This is how I am doing it:

          root
         /    \
        /      \
CameraNode    polygon
    |
    |
  scene
The CameraNode has a node mask (only the main camera traverses the
CameraNode).
Both cameras render the polygon with the pre-rendered (frame buffer
object) texture.
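
In code, that graph looks roughly like the following (again a sketch;
buildGraph and MAIN_CAMERA_MASK are hypothetical names, and the textured
quad stands in for my polygon):

#include <osg/Camera>
#include <osg/Geode>
#include <osg/Geometry>
#include <osg/Group>
#include <osg/Texture2D>

// Hypothetical traversal mask: only the main camera's cull mask
// includes this bit, so only it traverses the RTT CameraNode.
const osg::Node::NodeMask MAIN_CAMERA_MASK = 0x1;

osg::ref_ptr<osg::Group> buildGraph(osg::Camera* rttCamera,
                                    osg::Texture2D* texture)
{
    osg::ref_ptr<osg::Group> root = new osg::Group;

    // The RTT CameraNode is masked so only the main camera visits it.
    rttCamera->setNodeMask(MAIN_CAMERA_MASK);
    root->addChild(rttCamera);

    // A textured quad standing in for the polygon; both cameras
    // render it with the pre-rendered texture.
    osg::ref_ptr<osg::Geometry> quad = osg::createTexturedQuadGeometry(
        osg::Vec3(0.0f, 0.0f, 0.0f),   // corner
        osg::Vec3(1.0f, 0.0f, 0.0f),   // width vector
        osg::Vec3(0.0f, 0.0f, 1.0f));  // height vector
    quad->getOrCreateStateSet()->setTextureAttributeAndModes(
        0, texture, osg::StateAttribute::ON);

    osg::ref_ptr<osg::Geode> polygon = new osg::Geode;
    polygon->addDrawable(quad.get());
    root->addChild(polygon.get());

    return root;
}

Both render surfaces then get the same root as their scene data, and the
main camera's cull mask is set so that only it traverses the CameraNode.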


===============================
  Igor Naigovzin
  Technion - CS and Biology Student
  Rafael - Software Developer
  email: [EMAIL PROTECTED]
===============================