Art,

Here are the links to the MPV project:

  CIGI website: http://cigi.sourceforge.net/
  MPV website:  http://cigi.sourceforge.net/product_mpv.php
> From what I can see (and I am not an expert) there is no direct use of
> osg::Texture, UnitTexture or osg::Camera. It seems to me that osg::Geode,
> Geometry and StateSet are used in most places where rendering is done.
> The scene graph looks something like this, before much data is added:
>
>   RootNode
>     EntityBranchNode
>       Entities...
>     TerrainBranchNode
>       Terrain info...
>     SkyBranchNode
>       Sky info...
>
> If I do a sceneGraphDump with terrain and other stuff loaded, I get a
> really long dump, but I still can't see any information about cameras or
> such.
>
> What I do get from the MPV that is related to rendering: in a
> RenderCameraCigiSDLOSG plugin they handle the viewport and get the
> SceneView like this:
>
>   osgUtil::SceneView *sceneView = viewport->getSceneView();
>
> (I would need to modify the current MPV code to get hold of this.)
>
> What I can get hold of directly is:
>
>   //=========================================================
>   //! The camera matrices, one for each view. Each one of these is a
>   //! complete modelview matrix. Retrieved from the blackboard.
>   //!
>   std::map< int, osg::Matrix > *cameraMatrixMap;
>
>   //=========================================================
>   //! Map containing the view group parameters (provides additional
>   //! view transformation, etc). Retrieved from the blackboard.
>   //!
>   std::map< int, ViewGroupParams * > *viewGroupParamsMap;
>
>   //=========================================================
>   //! Map containing the view parameters (used for frustum setup,
>   //! entity-following, etc). Retrieved from the blackboard.
>   //!
>   std::map< int, XL::XLPtr<View> > *viewMap;
>
> Again, thanks for your help!
>
> Patrick

Hi Patrick,

osgPPU uses the camera only to get its texture (i.e. the texture to which
the scene is rendered). You do not require a camera at all: you can use
UnitTexture to feed the osgPPU pipeline with any texture (for example, the
texture you get from MPV). To provide a texture, you need some protocol to
get the texture from MPV into an osg::Texture container.
Do you have a link to the MPV, so that I can get more info about it? How
are you managing the combination of OSG and MPV?

cheers

Art,

> From what I can understand, it uses cameras, views and viewports, mostly
> from the osgSDLView, but it also uses "xlsignal"(?) to pass the pointers
> around. At a high level, MPV works by defining a single window (per
> process). This window makes use of a viewer which possibly has multiple
> viewports. (I guess this is pretty normal...) The views are defined in a
> definition file, and the information driving them comes through the CIGI
> protocol. Can I use the viewports and apply the osgPPU processors to
> it (them), or do I have to attach the processor

Patrick Castonguay
_______________________________________________
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org