Re: [osg-users] wrapping an opengl texture to an osg texture
Hi Qingjie, I haven't experimented with CUDA so I can't comment on the specifics. The best route is probably to see if anyone has published an OSG/CUDA integration example. Robert.

On 25 September 2015 at 13:04, Qingjie Zhang <305479...@qq.com> wrote:
> Hi Robert,
>
> I'm trying to do something with the "m_frontTex" in CUDA, so I get the
> GLuint and write some values into it, following a CUDA_Opengl_interp
> example.
>
> But I found there's no change in the texture after the CUDA processing.
> So I'm unsure whether the texture updates automatically when the "GLuint"
> changes; if it does, there must be something wrong in my CUDA processing.
>
> Thank you!
>
> Qingjie.
>
> robertosfield wrote:
> > Hi Qingjie,
> >
> > If you are getting the GL texture handle id from an OSG TextureObject,
> > it'll already be associated with an osg::Texture (i.e. m_frontTex), so
> > why not just reuse it?
> >
> > Robert.
> >
> > [snip: earlier quoted messages]
>
> --
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=65215#65215

___
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
[osg-users] Oculus+OSG+PostProcess
Hi Björn and all,

I'd like to test an application on the Oculus with some post-process effects. On the Oculus side, the only recommendation I've read (https://developer.oculus.com/documentation/intro-vr/latest/concepts/bp_intro/) is to apply the post effect to both eyes independently (taking their z-depth into account).

On the osg integration side, I see these points to be covered:

1. set up a scene with some post-processing cameras that is compatible with the two-slave-camera setup done by the OculusViewer;
2. the post effect should probably affect only the color buffer copied to the Oculus, while the depth buffer (used for the time warp) should be the one written by the main render camera with the actual 3D scene values.

I'm not sure which scheme would best achieve that. In particular, it would be good to keep the current slave camera setup with respect to projection and view matrices, but move the buffer management to the last camera stage of the post-processing chain.

What do you think? I'd like to help in coding/testing a solution.

Ricky
Re: [osg-users] Oculus+OSG
Riccardo Corsi wrote:
> thanks Björn for the MSAA pointer - it was not the cause of the issue, but the
> explanation on the 75Hz helped me solve the problem:
> I had to remove the V-sync setting from the driver configuration.

Ok, I guess you had your V-sync setting set to "Force on". Otherwise this should be handled by code inside the osgoculusviewer.

Riccardo Corsi wrote:
> So to keep a smooth experience the osg application must be able to render at
> 75Hz for both eyes - is that correct?
> This is an important tip to keep in mind...

To cite "VR Design: Best Practices":

"PERFORMANCE, i.e. FRAME RATE, IS KEY
the difference between 75fps and 30fps is night and day... you MUST deliver 75 fps at a minimum
don't ship until you hit this bar
this isn't an average, it's a floor: so target 100fps average
this isn't a luxury, it's a requirement."

Source: http://dsky9.com/rift/vr-design-best-practices/

Best regards
Björn

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=65214#65214
Re: [osg-users] INVALID_OPERATION with compressed textures with mipmaps in OSG 3.4.0
Scott, your fixes did not fix the issues on OS X with the DDS textures... but thanks anyway.

On Thu, Sep 24, 2015 at 9:26 PM, Robert Osfield wrote:
> Hi Scott,
>
> Thanks for the follow up.
>
> Could you post the whole modified file? This way we can avoid any possible
> copy-and-paste errors.
>
> Thanks,
> Robert.
>
> On 24 September 2015 at 20:13, Davis, Timothy S CTR comnavairsyscom <
> timothy.s.davis@navy.mil> wrote:
>
>> Sorry. It keeps trying to encrypt the message.
>>
>> Robert,
>>
>> While continuing to build a model that reproduces the problem, I
>> discovered something I didn't see before. The IVE model had an incorrect
>> number of mipmap levels (it had 8) for a 32x32 texture. It makes sense
>> that glTexStorage2D would generate INVALID_OPERATION in this case.
>> Rebuilding the model from a source with uncompressed textures and
>> recompressing the textures worked. The original model was converted with a
>> much older version of OSG, pre OSG 3 for sure.
>>
>> So I was barking up the wrong tree. :)
>>
>> That addresses my specific issue without needing a change to OSG 3.4.0.
>> However, I still think it is worth trying for the OS X case.
>>
>> Trajce,
>>
>> In osg/Texture.cpp, function applyTexImage2D(), find the line:
>>
>>     useTexStorage &= sizedInternalFormat != 0;
>>
>> and add the following after it:
>>
>>     if ( useTexStorage && compressed_image && numMipmapLevels > 2 )
>>     {
>>         numMipmapLevels -= 2;
>>     }
>>
>> This is clearly not production quality, as it assumes the block size is 4
>> and complete mipmaps down to 1x1. It should be enough to check the
>> approach. You may have to set GL_TEXTURE_MAX_LEVEL if the driver thinks
>> the texture is incomplete, but I didn't have that issue.
>>
>> Scott

--
trajce nikolov nick
Re: [osg-users] Modern GLSL and OSG
Hi Jan,

Many thanks for the additional information. :)

On 25/09/15 17:48, Jan Ciger wrote:
> On Fri, Sep 25, 2015 at 2:01 AM, Garth D wrote:
>> Thank you for the suggestion. :)
>>
>> I have to admit being unsure how to set the specific context at present.
>> From what I've read, there'd need to be a glXCreateContextAttribsARB call
>> somewhere to create the GL3 context under Linux, and there doesn't seem
>> to be one. Skimming the source (I'm presently using OSG 3.2.1, but had a
>> peek at 3.4.0) suggests that perhaps this support is only available on a
>> Windows build. On the cmake side, OSG_GL3_AVAILABLE in my build is off.
>> However, glVersion=4.4 appears in my log during a build.
>
> The cmake variables are meant to be set by the developer, see the
> instructions here:
> http://permalink.gmane.org/gmane.comp.graphics.openscenegraph.user/72253
>
> It is possible that GL3 support is still available on Windows only, but
> even if that is the case, it shouldn't be terribly difficult to make it
> work in Linux too. The drivers and OpenGL implementations certainly
> support it now.

Thanks. I thought this might be the case.

>> I'm beginning to suspect that aiming for strict OpenGL 3+ or 4+ might
>> not be the best thing for me to concentrate on at the moment. I might be
>> better off just using it where it helps and leaving a jump to "pure"
>> modern GL until a later date.
>
> Well, it depends on what you want to do. Moving to the "modern" OpenGL
> without the fixed pipeline functionality is a big jump, because basically
> you don't have anything pre-defined by the library anymore. No matrix
> stack (glPush/glPop), everything needs to be managed using shaders, etc.
> The older functionality makes it easier to get something displayed on the
> screen quickly, because there are fewer things to set up. The newer stuff
> gives you much more flexibility, though.

Excellent, many thanks for the explanation and clarification.

My original motivation was to move to using "modern" OpenGL (i.e. avoiding anything deprecated) because I didn't want to be in the situation where I started relying on something that was likely to be removed in the future. Additionally, I have a need that I know is not going to be handled very well by the fixed-function pipeline. From experience I know that whilst the FFP is neat for simpler tasks, it feels like a massive burden as the needs get more complex.

A last goal was as a learning exercise. The aim was to rely on shader functionality a bit more strictly than minimally necessary, as a means of taking care of the glaring (and increasingly hard-to-explain professionally) hole in my 3D knowledge.

As I've been digging around, the first goal seems less and less important, so my goals have been shifting somewhat. Going pure non-FFP is something I can handle down the track.

> Using OSG helps to hide this complexity, because OSG abstracts and
> manages many of these things for you, regardless of which OpenGL profile
> you are using.

Excellent. I couldn't tell if this was the case, or if OpenGL 3+ was an experimental thing in OSG, or how complete the OSG support for it was. At this point I've made it over the initial hurdle of setting things up properly in OSG and getting the basics running, so I'm on to the meat of the task: messing about with the shaders themselves.

Thanks for taking the time to write that up, it has clarified things considerably. Much appreciated. :)

Cheers,
Garth
[osg-users] wrapping an opengl texture to an osg texture
Hi,

I have an OpenGL texture (GLuint), and I'd like to wrap it in an osg::Texture2D. Is there some way to do this?

Actually, I got the GLuint in this way:

Code:

struct MyCameraPostDrawCallback : public osg::Camera::DrawCallback
{
    virtual void operator()(osg::RenderInfo& renderInfo) const
    {
        int contextID = renderInfo.getContextID();
        GLuint handle = m_frontTex->getTextureObject(contextID)->id();
    }
};

I've googled and searched in this forum, finding that maybe a way to do this is:

Code:

osg::Texture2D *_texture = new osg::Texture2D;
osg::Texture::TextureObject *textureObject = new
    osg::Texture::TextureObject(_texture, handle, GL_TEXTURE_2D);
textureObject->setAllocated();
_texture->setTextureObject(renderInfo.getContextID(), textureObject);
osg::State *state = renderInfo.getState();
unsigned int _textureStage = 0;

state->setActiveTextureUnit(_textureStage);
_texture->apply(*state);
state->haveAppliedTextureAttribute(_textureStage, _texture);

But I don't know what "_textureStage" should be. I tried "0", but "state->setActiveTextureUnit(_textureStage);" returns false.

Is this the right way to achieve my goal? If not, how should I do it?
...

Thank you!

Cheers,
Qingjie

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=65205#65205
Re: [osg-users] Oculus+OSG
Hi Björn,

First of all, thank you for keeping the Oculus integration up to date!

I have just updated to SDK 0.7 + osgOculus head and the viewer works, but:

1. while the provided Oculus demos are really smooth with respect to motion, the osg viewer example is quite jerky; in particular, rotating the head causes a sort of "hiccup" during the motion;
2. when using the osg viewer example, the right eye monitor flickers every second or so: a part of the screen becomes black/corrupted for a few frames.

Have you ever noticed such behaviours? Have you got any suggestions to fix them?

Thank you,
Ricky

On Wed, Sep 9, 2015 at 1:46 PM, Björn Blissing wrote:
> Hi,
>
> I just fixed a performance related bug.
>
> Due to misuse of an enum I had accidentally disabled all culling. (This
> bug would probably have been avoided if strongly typed enums à la C++11
> were used.)
>
> This bug only affects users of Oculus SDK 0.6 and 0.7. I urge all users
> of these versions of OsgOculusViewer to update to the head revision on
> GitHub.
>
> Best regards,
> Björn
>
> --
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=65077#65077
Re: [osg-users] Oculus+OSG
Hi Ricky,

We have recently added support for MSAA in the OsgOculus integration (credits to Chris Denham). Rendering with MSAA enabled requires some extra GPU horsepower and is probably the reason you see the stuttering in the rendering. If you look at the performance HUD (by pressing '2' on the keyboard), you will see what frame rate the compositor is working with. Anything under 75fps will cause stutter (on the DK2), and the performance HUD will report it by incrementing the value "Compositor Missed V-Sync Count".

To disable MSAA, change this line to zero (or you could try lowering the sample count to 2 and see if that helps):

https://github.com/bjornblissing/osgoculusviewer/blob/d0c425d3eda01b8134518ef524906e736a6aed9b/src/viewerexample.cpp#L50

This is probably also related to the flickering you are seeing.

Best regards
Björn

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=65210#65210
Re: [osg-users] wrapping an opengl texture to an osg texture
Hi Qingjie,

If you are getting the GL texture handle id from an OSG TextureObject, it'll already be associated with an osg::Texture (i.e. m_frontTex), so why not just reuse it?

Robert.

On 25 September 2015 at 07:27, Qingjie Zhang <305479...@qq.com> wrote:
> Hi,
> I have an OpenGL texture (GLuint), and I'd like to wrap it in an
> osg::Texture2D. Is there some way to do this?
>
> Actually, I got the GLuint in this way:
>
> Code:
>
> struct MyCameraPostDrawCallback : public osg::Camera::DrawCallback
> {
>     virtual void operator()(osg::RenderInfo& renderInfo) const
>     {
>         int contextID = renderInfo.getContextID();
>         GLuint handle = m_frontTex->getTextureObject(contextID)->id();
>     }
> };
>
> I've googled and searched in this forum, finding that maybe a way to do
> this is:
>
> Code:
>
> osg::Texture2D *_texture = new osg::Texture2D;
> osg::Texture::TextureObject *textureObject = new
>     osg::Texture::TextureObject(_texture, handle, GL_TEXTURE_2D);
> textureObject->setAllocated();
> _texture->setTextureObject(renderInfo.getContextID(), textureObject);
> osg::State *state = renderInfo.getState();
> unsigned int _textureStage = 0;
>
> state->setActiveTextureUnit(_textureStage);
> _texture->apply(*state);
> state->haveAppliedTextureAttribute(_textureStage, _texture);
>
> But I don't know what "_textureStage" should be. I tried "0", but
> "state->setActiveTextureUnit(_textureStage);" returns false.
>
> Is this the right way to achieve my goal? If not, how should I do it?
> ...
>
> Thank you!
>
> Cheers,
> Qingjie
>
> --
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=65205#65205
Re: [osg-users] Modern GLSL and OSG
On Fri, Sep 25, 2015 at 2:01 AM, Garth Dwrote: > Thankyou for the suggestion. :) > > I have to admit being unsure how to set the specific context at present. > > From what I've read, there'd need to be a glXCreateContextAttribsARB call > somewhere to create the GL3 context under Linux- and there doesn't seem to > be one. Skimming the source (I'm presently using OSG 3.2.1, but had a peek > at 3.4.0) suggests that perhaps this support is only available on a Windows > build. On the cmake side, OSG_GL3_AVAILABLE in my build is off. However, > glVersion=4.4 appears in my log during a build. The cmake variables are meant to be set by the developer, see the instructions here: http://permalink.gmane.org/gmane.comp.graphics.openscenegraph.user/72253 It is possible that GL3 support is still available on Windows only, but even if that is the case, it shouldn't be terribly difficult to make it work in Linux too. The drivers and OpenGL implementations certainly support it now. > I'm beginning to suspect that aiming for strict OpenGL 3+ or 4+ might not be > the best thing for me to concentrate at the moment. I might be better off > just using it where it helps and leaving a jump to "pure" modern GL until a > later date. Well, it depends on what you want to do. Moving to the "modern" OpenGL without the fixed pipeline functionality is a big jump, because basically you don't have anything pre-defined by the library anymore. No matrix stack (glPush/glPop), everything needs to be managed using shaders, etc. The older functionality is easier to make display something on the screen quickly, because there are fewer things to set up. The newer stuff gives you much more flexibility, though. Using OSG helps to hide this complexity, because OSG abstracts and manages many of these things for you, regardless of which OpenGL profile are you using. J. ___ osg-users mailing list osg-users@lists.openscenegraph.org http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
Re: [osg-users] Oculus+OSG
Hmm... make sure that:
a) SLI is disabled (if you have two or more cards linked);
b) you're running the latest nVidia developer drivers from here: https://developer.nvidia.com/gameworks-vr-driver-support

I have no such artifacts. Running the OSG 3.2 branch here.

2015-09-25 10:52 GMT+02:00 Riccardo Corsi:
> Hi Björn,
>
> first of all thank you for keeping the Oculus integration up to date!
>
> I have just updated to SDK 0.7 + osgOculus head and the viewer works, but:
>
> 1. while the provided Oculus demos are really smooth with respect to
> motion, the osg viewer example is quite jerky; in particular, rotating
> the head causes a sort of "hiccup" during the motion.
>
> 2. when using the osg viewer example, the right eye monitor flickers
> every second or so: a part of the screen becomes black/corrupted for a
> few frames.
>
> Have you ever noticed such behaviours?
> Have you got any suggestions to fix them?
>
> Thank you,
> Ricky
>
> [snip: earlier quoted messages]
Re: [osg-users] wrapping an opengl texture to an osg texture
Hi Robert,

I'm trying to do something with the "m_frontTex" in CUDA, so I get the GLuint and write some values into it, following a CUDA_Opengl_interp example.

But I found there's no change in the texture after the CUDA processing. So I'm unsure whether the texture updates automatically when the "GLuint" changes; if it does, there must be something wrong in my CUDA processing.

Thank you!

Qingjie.

robertosfield wrote:
> Hi Qingjie,
>
> If you are getting the GL texture handle id from an OSG TextureObject,
> it'll already be associated with an osg::Texture (i.e. m_frontTex), so
> why not just reuse it?
>
> Robert.
>
> [snip: earlier quoted messages]

--
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=65215#65215
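One note on the underlying question: an osg::Texture2D's TextureObject is only a thin wrapper around the GL id, so anything CUDA actually writes into that texture is picked up on the next draw with no OSG-side notification needed. If nothing changes, the CUDA side is usually at fault, e.g. writing without mapping the resource, or registering it with flags that don't permit the kind of write performed. A hedged sketch of the usual CUDA/OpenGL interop steps (CUDA runtime API; error checking elided; this must run with the owning GL context current, e.g. inside a draw callback; the function name, sizes, and the GL_RGBA8 format are my assumptions):

```cpp
#include <cuda_gl_interop.h>  // also pulls in the GL headers for GLuint
#include <cuda_runtime.h>

// 'handle' is the id from m_frontTex->getTextureObject(contextID)->id().
// Registration can be done once and cached; map/unmap bracket each write.
void writeToGLTexture(GLuint handle, int width, int height, const void* pixels)
{
    cudaGraphicsResource* resource = nullptr;
    // Use cudaGraphicsRegisterFlagsSurfaceLoadStore instead if a kernel
    // writes the texture via surf2Dwrite rather than a memcpy.
    cudaGraphicsGLRegisterImage(&resource, handle, GL_TEXTURE_2D,
                                cudaGraphicsRegisterFlagsNone);

    cudaGraphicsMapResources(1, &resource, 0);
    cudaArray_t array = nullptr;
    cudaGraphicsSubResourceGetMappedArray(&array, resource, 0, 0);

    // Copy data into the texture's level-0 array (assumes 4 bytes/texel).
    cudaMemcpy2DToArray(array, 0, 0, pixels,
                        width * 4, width * 4, height,
                        cudaMemcpyHostToDevice);

    cudaGraphicsUnmapResources(1, &resource, 0);
    cudaGraphicsUnregisterResource(resource);
}
```

If the texture still appears unchanged after a correctly mapped write, checking every cudaError_t return value is the quickest way to find the failing step.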
Re: [osg-users] Oculus+OSG
Hi all,

Thanks Björn for the MSAA pointer - it was not the cause of the issue, but the explanation on the 75Hz helped me solve the problem: I had to remove the V-sync setting from the driver configuration. Works like a charm now. :)

So, to keep a smooth experience the osg application must be able to render at 75Hz for both eyes - is that correct? This is an important tip to keep in mind...

I also have some questions about post-processing, but I'll switch to a different thread.

Thanks again!
Ricky

On Fri, Sep 25, 2015 at 11:09 AM, Björn Blissing wrote:
> Hi Ricky,
>
> We have recently added support for MSAA in the OsgOculus integration
> (credits to Chris Denham). Rendering with MSAA enabled requires some
> extra GPU horsepower and is probably the reason you see the stuttering
> in the rendering.
>
> [snip]
>
> Best regards
> Björn