Re: [osg-users] Problems with RTT-cameras when changing window size
Hi. The cameras are part of the scene graph. I was still using frame-buffer-based cameras because they seem to be the only way to get sufficient real-time performance for more complex applications. I figured out that this causes the problem and found no way to get rid of the error, even when adapting the viewport to window size changes or trying similar approaches. To solve this I now let the user choose between FRAME_BUFFER and FRAME_BUFFER_OBJECT according to their needs: the one mode allows enlarging the window without having to reinitialize the cameras, the other one is faster. Cheers, Steffen
_______________________________________________
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
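A minimal sketch of such a toggle, assuming the RTT cameras are kept in a vector (the function name and camera list are hypothetical, not from the original code):

```cpp
#include <osg/Camera>
#include <osg/ref_ptr>
#include <vector>

// Switch all RTT cameras between the two render target implementations.
// FRAME_BUFFER_OBJECT is usually faster; FRAME_BUFFER tolerates window
// resizes beyond the initial size without reinitializing the cameras.
void setRenderTargetMode(std::vector< osg::ref_ptr<osg::Camera> >& rttCameras,
                         bool useFbo)
{
    const osg::Camera::RenderTargetImplementation impl =
        useFbo ? osg::Camera::FRAME_BUFFER_OBJECT
               : osg::Camera::FRAME_BUFFER;
    for (size_t i = 0; i < rttCameras.size(); ++i)
        rttCameras[i]->setRenderTargetImplementation(impl);
}
```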
[osg-users] Problems with RTT-cameras when changing window size
Hi, I have a little problem with some render-to-texture cameras in connection with a WindowSizeHandler. I took a bit of code out of an application I wrote earlier to put it into a lib and make the functions usable for other apps. When testing this lib with a simple example program consisting of a basic viewer, I suddenly get issues when changing the size of the viewer window - something that didn't happen in the old application. I tried to adjust the parameters of my example viewer to match the ones used in the original application, but I still get the error, so I cannot really find out what caused the problem. It only occurs when making the window bigger than the original size - the size the RTT cameras were initialized with. I get the following error repeatedly:

Warning: detected OpenGL error 'invalid value' after applying attribute Viewport 01A56A88

I use the standard WindowSizeHandler. It's really strange that I didn't get the error in the original application. What is also weird is that it isn't connected to simply resizing the window: it only occurs if the window gets bigger than it was originally, while making the window smaller works fine. Does anyone have experience with this error message and know what the reason may be? I know it must be connected with my RTT cameras, since it disappears if I take them out. Any hint would be appreciated. Thanks, Steffen
Re: [osg-users] Spatial View SVI autostereoscopic display support.
Hi Chris. I spent quite some time doing just what you are planning to do - getting an OpenSceneGraph application to work with a Spatial View display - and in the end I was successful. I don't know at the moment if I can provide you with any code, but of course I can tell you how I am doing it. I have no idea if this is the best way, but it works quite well. I use 5 RTT cameras to save the five views into textures that later become input for the interlacer. You can position these 5 cameras using the SVI camera API or calculate the view and projection matrices yourself. You then let the SVI interlacer render the result from the 5 textures. Every time the master camera (the main OSG camera) changes, you also have to recalculate the matrices of your slave cameras. The important part is to be careful about when you call the API functions: you have to do this in a part of your program where the OpenGL context has been made current by OSG. You can use a post-draw camera callback for this. See the osgteapot example for some code on how to do this. Good luck.
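A sketch of the callback idea described above: a post-draw callback on a camera runs while OSG has the GL context current, which is a safe place for GL-based vendor calls. The sviInterlace() call stands in for the actual SVI API and is hypothetical:

```cpp
#include <osg/Camera>
#include <osg/RenderInfo>

// Post-draw callback: invoked by OSG after the camera has rendered,
// with its graphics context still current, so raw GL / vendor API
// calls are safe here.
class InterlaceCallback : public osg::Camera::DrawCallback
{
public:
    virtual void operator()(osg::RenderInfo& renderInfo) const
    {
        // The GL context is current at this point.
        // sviInterlace();  // hypothetical SVI interlacer call
    }
};

// Typical usage: attach to the last of the five RTT cameras, so all
// view textures are filled before the interlacer reads them:
//   lastRttCamera->setPostDrawCallback(new InterlaceCallback);
```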
Re: [osg-users] performance issues with RTT
Hi, after reading Viggo's post I changed my implementation to use FRAME_BUFFER instead of FRAME_BUFFER_OBJECT and this did the job for me. At least on the fast machine (the one I posted details about earlier) it now runs fast enough without having to make sacrifices resolution-wise, and this is all I could hope for at the moment ;) If I find other ways to speed everything up or deal with this problem, I will let you know. Thanks, Steffen
Re: [osg-users] performance issues with RTT
Good morning, I'm just rendering to texture, no images or pixel reading involved. Regards, Steffen
[osg-users] performance issues with RTT
Hi everybody. I have some performance problems with rendering to texture. I am doing some interlacing for an autostereoscopic display. Therefore I need 5 different views of the scene, which I render into textures I later use as input for the interlacing API. But doing the RTT slows down my application significantly (even in Release mode). I first thought it was the interlacer that takes that much time, but when I disabled all interlacing and everything that updates the cameras after the master camera has moved, it didn't really change. Even now, when all I do is display the picture of the master camera and additionally render to five textures, it runs very haltingly. I define 5 target textures and 5 RTT cameras via:

// Configure source textures for the interlacer
_interlacer->resetIterator();
for(int i = 0; i < _interlacer->getNumberOfTextures(); i++){
    osg::ref_ptr<osg::Texture2D> renderTexture = _interlacer->getIteratorTexture();
    renderTexture->setTextureSize(_interlacer->getResultTexture()->getTextureWidth(),
                                  _interlacer->getResultTexture()->getTextureHeight());
    renderTexture->setInternalFormat(GL_RGBA);
    renderTexture->setFilter(osg::Texture2D::MIN_FILTER, osg::Texture2D::LINEAR);
    renderTexture->setFilter(osg::Texture2D::MAG_FILTER, osg::Texture2D::LINEAR);
    _interlacer->incIterator();
}

// Configure cameras of the camera model
_cameras->resetIterator();
_interlacer->resetIterator();
for(int i = 0; i < _cameras->getNumberOfCameras(); i++){
    osg::ref_ptr<osg::Camera> textureCamera = _cameras->getIteratorCamera();
    textureCamera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
    textureCamera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    textureCamera->setViewport(0, 0, winWidth, winHeight);
    textureCamera->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    textureCamera->setRenderOrder(osg::Camera::PRE_RENDER);
    textureCamera->setGraphicsContext(_billInterface->getGraphicsContext());
    textureCamera->setAllowEventFocus(false);
    textureCamera->addChild(_billInterface->getScene());
    textureCamera->attach(osg::Camera::COLOR_BUFFER, _interlacer->getIteratorTexture().get());
    _cameras->incIterator();
    _interlacer->incIterator();
}

The five RTT cameras are children of a group node, which I add to the root node of my scene graph. Is there a way to speed up the rendering somehow? I could make the resolution of the textures smaller (currently it's 1280x1024), but this would decrease the quality of the final image, and I would have to scale the textures before interlacing, which probably also takes some time. Thanks for any hints, Steffen
Re: [osg-users] performance issues with RTT
Thanks for the hints so far... I gave you wrong information, since I forgot that I take care of the NPOT stuff myself. The API I use later on needs POT textures, so I make sure that only POT textures are rendered. So in fact my textures are normally 2048x1024. I know that the performance issues are almost certainly a problem of the texture rendering, since we have parts of the application where even larger numbers of (normal render-to-view) cameras are involved without slowing everything down as much. The scene is pretty simple too and runs smoothly without the RTT cameras. Back on Monday I will try to get some FPS values for exactly the same camera setup with and without RTT for different scene complexities (here in Germany the weekend begins now ;)). What I just found out is that the texture sizes make a huge difference in rendering performance: with 256x256 textures my cameras hardly slow down the viewer at all. That's also evidence that the problem has to be somewhere in the texture creation part. If this cannot be improved, then I probably have to find a good balance between quality and speed. But I still hope there are better ways to improve performance without having to sacrifice too much resolution-wise. BTW: I'm running my stuff on a Xeon 5160 at 3GHz with 2GB RAM and a Radeon X1900 card. Have a nice weekend, Steffen
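The POT rounding described above (1280x1024 becoming 2048x1024) can be done with a small helper; the function name is mine, not from the original code:

```cpp
// Smallest power of two >= v, for v >= 1.
// e.g. 1280 -> 2048, 1024 -> 1024 (already POT stays unchanged).
unsigned int nextPowerOfTwo(unsigned int v)
{
    unsigned int p = 1;
    while (p < v) p <<= 1;
    return p;
}

// usage: texture width 1280 is padded up to nextPowerOfTwo(1280) == 2048
```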
Re: [osg-users] Getting the opengl texture id of an osg::Texture2D
Thanks for the help, everything seems to work fine so far. But now I'm a little lost on what the best way to create an osg::Texture2D out of the id of an OpenGL texture is. I have the id of an OpenGL texture as the result delivered by the API I'm using, and I want to get this texture as an osg::Texture2D. I tried creating a new Texture2D, generating a TextureObject via generateTextureObject and finally setting the _id of this TextureObject to the id of my result texture. But I keep getting errors that way, because the generated TextureObject seems to be invalid. Do I have to add another step before setting the id? Or am I completely off and have to do it in a different way? How do I get an osg::Texture2D from my OpenGL texture id? Regards, Steffen
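One commonly suggested approach is to construct a TextureObject around the existing id instead of generating a fresh one. A sketch, with the caveat that the TextureObject constructor signature differs between OSG versions (this follows the form that takes the owning texture as first argument), and that contextID must identify the graphics context the id was created in:

```cpp
#include <osg/Texture2D>

// Wrap an externally created GL texture id in an osg::Texture2D so OSG
// can bind and use it. setAllocated(true) tells OSG the GL object
// already exists and must not be (re)allocated.
osg::ref_ptr<osg::Texture2D> wrapGLTexture(GLuint id, unsigned int contextID)
{
    osg::ref_ptr<osg::Texture2D> tex = new osg::Texture2D;
    osg::Texture::TextureObject* obj =
        new osg::Texture::TextureObject(tex.get(), id, GL_TEXTURE_2D);
    obj->setAllocated(true);
    tex->setTextureObject(contextID, obj);
    return tex;
}
```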
[osg-users] problems with checking for OpenGL extensions
Hi, I've got another problem with using OpenGL code in connection with OpenSceneGraph. I have an OpenGL API that I use in my OSG application. This API won't work when some required OpenGL extensions are missing; instead it lists the missing extensions. In my application it does just that, because it cannot find some of them. The strange thing is that when running the example programs that came with the API outside of my program, everything works just fine. Also, according to OpenGL Extension Viewer my hardware should support these features. I read that there is also an OSG function to test for extensions, but since I have no influence on the API, I have to find out why its internal checks fail when used in my application. Does anyone have an idea why these problems occur and what I have to change to make it work? I thought the available extensions depend only on the graphics driver... is this wrong? Could there be some conflict with the libraries I'm using? If this has already been answered somewhere, I'm sorry - I read through some posts concerning extensions, but nothing seemed to fit my problem. Thanks in advance, Steffen
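For reference, the OSG-side extension check mentioned above is osg::isGLExtensionSupported. A sketch of using it - note that it only reports correctly once a context has been realized and made current, which is also a frequent reason why third-party extension checks fail when called too early or against a different context:

```cpp
#include <osg/GLExtensions>
#include <osg/GraphicsContext>

// Query an extension in the context OSG actually renders with.
// GL_EXT_framebuffer_object is just an example extension name.
bool hasFboSupport(osg::GraphicsContext* gc)
{
    const unsigned int contextID = gc->getState()->getContextID();
    return osg::isGLExtensionSupported(contextID,
                                       "GL_EXT_framebuffer_object");
}
```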
[osg-users] Getting the opengl texture id of an osg::Texture2D
Hi, I cannot find the correct function to get the OpenGL texture id of an osg::Texture2D. I have several textures in OSG that I want to use as input for a library that needs the textures as GLuint. How do I get my OSG textures in there? And how can I create an osg::Texture2D out of the GLuint I get as a result after running the API? I would be really happy about any hints on how to do this. Thanks in advance, Steffen
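For the first question, the id lives in the texture's per-context TextureObject. A sketch, with the caveat that the object only exists after OSG has applied the texture at least once in that context (e.g. after one rendered frame), and that older OSG versions expose the id as a public member rather than an accessor:

```cpp
#include <osg/Texture2D>

// Fetch the raw GL texture id of an osg::Texture2D for a given
// graphics context. Returns 0 if the texture has not yet been
// created in that context.
GLuint glIdOf(osg::Texture2D* tex, unsigned int contextID)
{
    osg::Texture::TextureObject* obj = tex->getTextureObject(contextID);
    return obj ? obj->id() : 0;
}
```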
[osg-users] render to texture only delivers black texture
Hi everyone. I have some issues with rendering to a texture. I am writing a plug-in for an application. This application has a main viewer window where it displays a scene. My plug-in has its own window with a new viewer and everything. The new window now needs a texture showing what the main window displays at this very moment. I need it later on for an API for interlacing, but for now I just want to show it in the plug-in window to see whether render-to-texture works as it should. That is why the plug-in window shows a screen-aligned quad that displays this texture. If I use an image as the texture of the quad, everything works as it should, so my plug-in window should be fine the way it is. But if I use the texture I render using a camera in the main application's scene, I just get a black texture. I create the following camera using information from the camera of the main application, which I get via an interface named _billInterface, and attach the texture:

osg::Camera* camera = new osg::CameraNode();
camera->setCullingActive(false);
camera->setClearColor(osg::Vec4(1.0f, 1.0f, 1.0f, 1.0f));
camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// Set camera properties according to the main camera of the main application
osg::Camera* billCamera = _billInterface->getCamera();
camera->setViewport(0, 0,
                    _billInterface->getGraphicsContext()->getTraits()->width,
                    _billInterface->getGraphicsContext()->getTraits()->height);
camera->setProjectionMatrix(billCamera->getProjectionMatrix());
camera->setViewMatrix(billCamera->getViewMatrix());
camera->setReferenceFrame(billCamera->getReferenceFrame());
camera->setStateSet(billCamera->getStateSet());
camera->setRenderOrder(osg::CameraNode::PRE_RENDER, 1);
camera->setRenderTargetImplementation(osg::CameraNode::FRAME_BUFFER_OBJECT);
camera->attach(osg::CameraNode::COLOR_BUFFER, texture.get());

Then I add the scene of the main application to the camera so that it shows this scene:

camera->addChild(_billInterface->getScene());

Now I take this camera and put it into the scene graph of the main application:

_billInterface->getRootNode()->getParent(0)->addChild(camera);

(That way it is at the same depth as the root node of the scene and the node containing the HUD. I put it in there so that the camera's output is rendered into the texture every time the viewer of the main application is updated.) I know that I get a texture that way, because the quad that holds this texture is normally red, and after creating the camera and attaching the texture it turns black. I guess that either some camera properties are wrong or I have to put the camera somewhere else. Can someone give me a hint on what I might be doing wrong? What are the main things I have to take care of when doing render-to-texture? Is the texture automatically updated every time the scene containing my render-to-texture camera is rendered? Thanks in advance for any help, Steffen
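One thing the setup above does not show is how the target texture is prepared. A zero-sized or format-less texture is a common cause of black RTT output with FRAME_BUFFER_OBJECT, since the FBO attachment then has nothing valid to render into. A minimal sketch of preparing the texture before attaching it (width/height are example values):

```cpp
#include <osg/Texture2D>

// Create a texture suitable as an FBO color attachment: it needs an
// explicit size and internal format *before* Camera::attach() is
// called, because the FBO is built from these values.
osg::ref_ptr<osg::Texture2D> makeRttTexture(int width, int height)
{
    osg::ref_ptr<osg::Texture2D> tex = new osg::Texture2D;
    tex->setTextureSize(width, height);
    tex->setInternalFormat(GL_RGBA);
    tex->setFilter(osg::Texture2D::MIN_FILTER, osg::Texture2D::LINEAR);
    tex->setFilter(osg::Texture2D::MAG_FILTER, osg::Texture2D::LINEAR);
    return tex;
}
```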
[osg-users] osg::Texture2D and OpenGL textures working together
Hi everybody, I have never used OpenGL and OpenSceneGraph together (and only a little bit of OpenGL alone), so I'm a little unsure of what the best way to solve the following problem is. I want to use an API for interlacing different views of a scene for an autostereoscopic display. The API uses OpenGL and basically takes a vector of GLuints containing the textures with the different views, interlaces them and renders the result into another texture referenced by a GLuint. What I probably have to do is the following: I have to calculate the different views I want to interlace and put them into the vector of GL textures. How do I do this best? Or can I convert GL textures wrapped by OSG as osg::Texture2D into plain GL textures? Then I could render my views into osg::Texture2Ds and put them into the vector. And how do I use the texture I get from the API in OSG afterwards? Can I somehow create an osg::Texture2D from it? I would be very happy if someone could give me a hint on what the best way to do this looks like, or what I should search for to find some examples, since I haven't been very successful so far. Thanks a lot, Steffen