[osg-users] Problem reading unclamped float values and non-standard internal formats PBO

2019-10-14 Thread Omar Álvarez
Hi,

I am trying to render to texture and read the results with an osg::Image.
When I use standard internal formats like GL_RGB it works fine. The problem
appears when I try to use, for example, GL_FLOAT with GL_RGB16.

When I get my image back, the internal format is not the one I set on the
texture, and I see errors like:

error pixelFormat = 805b
ContextData::incrementContextIDUsageCount(0) to 2
Warning: detected OpenGL error 'invalid value' at after
stateset.compileGLObjects in GLObjectsVisitor::apply(osg::StateSet&
stateset)

Here is my PBO setup code:

osg::Camera::RenderTargetImplementation renderTargetImplementation =
    osg::Camera::PIXEL_BUFFER;

_camera->setName(label);
// viewport set in configure() method
_camera->setClearColor(osg::Vec4(0., 1., 1., 1.));
_camera->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
_camera->setReferenceFrame(osg::Transform::ABSOLUTE_RF);
_camera->setRenderOrder(osg::Camera::POST_RENDER);
_camera->setRenderTargetImplementation(renderTargetImplementation);
_camera->setViewport(new osg::Viewport(0, 0, viewport_size.x(), viewport_size.y()));
_camera->getOrCreateStateSet()->setAttribute(
    new osg::ClampColor(GL_FALSE, GL_FALSE, GL_FALSE),
    osg::StateAttribute::ON | osg::StateAttribute::OVERRIDE |
    osg::StateAttribute::PROTECTED);

// Render to texture
_texture = new osg::Texture2D();
_texture->setTextureSize(viewport_size.x(), viewport_size.y());
_texture->setInternalFormat(GL_RGB16);
_texture->setSourceType(GL_FLOAT);
_texture->setSourceFormat(GL_RGB);
_texture->setFilter(osg::Texture::MIN_FILTER,osg::Texture::NEAREST);
_texture->setFilter(osg::Texture::MAG_FILTER,osg::Texture::NEAREST);
_texture->setWrap(osg::Texture::WRAP_S,osg::Texture::CLAMP_TO_EDGE);
_texture->setWrap(osg::Texture::WRAP_T,osg::Texture::CLAMP_TO_EDGE);
_texture->setWrap(osg::Texture::WRAP_R,osg::Texture::CLAMP_TO_EDGE);

// attach the texture and use it as the color buffer.
_camera->attach(osg::Camera::COLOR_BUFFER, _texture);

I am also having trouble with clamping: when it does work, values are
clamped to the range [0,1] even though I have disabled clamping.

Does anyone know how to set up a PBO to read unclamped values in
non-standard formats (GL_RGB16UI, GL_RGB16F, ...)?

Is this supported in OSG?
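
For reference, a minimal sketch of the readback path I have in mind,
building on the setup above (untested; viewport_size, _camera and _texture
as in the snippet). Note that GL_RGB16 is a normalized fixed-point format,
so it can only store values in [0,1] no matter what osg::ClampColor says,
while float formats such as GL_RGB16F or GL_RGB32F are not clamped:

// Use a true float internal format so stored values are not normalized.
_texture->setInternalFormat(GL_RGB16F);

// Attach an osg::Image to the color buffer; OSG fills it on readback.
osg::ref_ptr<osg::Image> image = new osg::Image;
image->allocateImage(viewport_size.x(), viewport_size.y(), 1,
                     GL_RGB, GL_FLOAT); // pixel format and type to read back
_camera->attach(osg::Camera::COLOR_BUFFER, image.get());

// After a frame has rendered, image->data() holds the float pixels.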

Cheers,

Omar.


Re: [osg-users] How to properly compile osg with EGL support?

2019-02-13 Thread Omar Álvarez
Could you open an issue so we don't pollute this discussion? We can work
out what is going on over on GitHub.

On Mon, Feb 11, 2019 at 9:24, Roy Lichtenheldt ()
wrote:

> Hi Omar,
>
> I tried to run your example, but didn't succeed in compiling it. I guess
> I am missing a step when compiling OSG. Do you have any additional hints
> on that?
>
> I also adapted our particle simulation (still based on OSG 3.4) to use an
> EGL context similar to how you did it; everything compiles fine, but it
> segfaults during the GL calls. (I compiled OSG with -lEGL added, however
> it does not seem to make a difference.)
>
> Cheers,
> Roy
>
> --
> Read this topic online here:
> http://forum.openscenegraph.org/viewtopic.php?p=75617#75617


Re: [osg-users] How to properly compile osg with EGL support?

2019-01-25 Thread Omar Álvarez
I have created a small repo with my changes based on your code. I will try
to update the readme today, but compiling it is pretty straightforward.

https://github.com/omaralvarez/osgEGL

I still need to clean up the code and see what is going on with the
warnings, but it looks like it is working properly with the OSG master
branch.

On Thu, Jan 24, 2019 at 23:49, Trajce Nikolov NICK (<
trajce.nikolov.n...@gmail.com>) wrote:

> Hi,
>
> here is how far we have got (not too far, but I think it is in the right
> direction). Let's see if you have some
>
> On Thu, Jan 24, 2019 at 12:06 AM Omar Álvarez 
> wrote:
>
>> Hi,
>>
>> Sounds good. Tomorrow morning I will compile it. I will probably also
>> make some changes to how GL is used in CMake, since to make EGL work you
>> need to link against GLVND GL. Thanks for the tips on getting OSG to compile.
>>
>> If you have some code already I will gladly take a look and see if I can
>> make it work. I'll be happy to share my findings with everybody. If
>> everything goes OK we should have a working example and maybe a tutorial.
>> Do you have a GitHub repo?
>>
>> Regards,
>>
>> Omar.
> --
> trajce nikolov nick


Re: [osg-users] How to properly compile osg with EGL support?

2019-01-24 Thread Omar Álvarez
I have managed to compile and run a simple EGL example that Trajce Nikolov
kindly provided. I am preparing a GitHub repo with the code, if he has no
issue with this. Here are the caveats that I have found:

- After installing the driver, you need to reboot for the EGL context to
become available. On Ubuntu 18.04, installing the latest drivers from the
NVIDIA PPA worked out of the box.
- eglinfo is awesome. NVIDIA device EGL needs to be working before you can
run EGL.
- When using the latest CMake (3.13.x), I think OSG does not properly
detect libOpenGL.so, which is needed for EGL. I will test this further to
be sure that is the cause.
- You need to set the viewport resolution manually, or the following error
pops up: "Error: cannot draw stage due to undefined viewport" (see the
sketch below).
- A warning pops up that I have not been able to fix: "void
StateSet::setGlobalDefaults() ShaderPipeline disabled."

Any idea why the warning message is popping up?

It would probably be interesting to integrate the graphics context code
into OSG, so that users don't need to write the custom class. If anyone
has any suggestions I would love to hear them.
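
For reference, a minimal untested sketch of the kind of headless setup I
mean, using a pbuffer-style graphics context (the custom EGL context class
from the repo would take the place of the default implementation here):

#include <osg/GraphicsContext>
#include <osgViewer/Viewer>

osg::ref_ptr<osg::GraphicsContext::Traits> traits =
    new osg::GraphicsContext::Traits;
traits->x = 0;
traits->y = 0;
traits->width = 1024;   // set explicitly, otherwise the
traits->height = 768;   // "undefined viewport" error above appears
traits->pbuffer = true; // off-screen, no window system needed

osg::ref_ptr<osg::GraphicsContext> gc =
    osg::GraphicsContext::createGraphicsContext(traits.get());

osgViewer::Viewer viewer;
viewer.getCamera()->setGraphicsContext(gc.get());
viewer.getCamera()->setViewport(
    new osg::Viewport(0, 0, traits->width, traits->height));
// viewer.setSceneData(...); viewer.frame(); as usual from here.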


Re: [osg-users] How to properly compile osg with EGL support?

2019-01-23 Thread Omar Álvarez
Hi,

Sounds good. Tomorrow morning I will compile it. I will probably also make
some changes to how GL is used in CMake, since to make EGL work you need
to link against GLVND GL. Thanks for the tips on getting OSG to compile.

If you have some code already I will gladly take a look and see if I can
make it work. I'll be happy to share my findings with everybody. If
everything goes OK we should have a working example and maybe a tutorial.
Do you have a GitHub repo?

Regards,

Omar.


[osg-users] How to properly compile osg with EGL support?

2019-01-23 Thread Omar Álvarez
Hi.

I am trying to compile OSG with EGL support in order to be able to run OSG
server side without an X display server. I have installed the EGL headers
and the latest NVIDIA driver. EGL is detected by OSG and I am requesting
GLVND GL:

cmake -DOpenGL_GL_PREFERENCE=GLVND .

I am getting linking errors:

[ 33%] Linking CXX executable ../../bin/osgversion
../../lib/libosg.so.3.6.3: undefined reference to `glDrawArrays'
../../lib/libosg.so.3.6.3: undefined reference to `glCallList'
../../lib/libosg.so.3.6.3: undefined reference to `glPointSize'
../../lib/libosg.so.3.6.3: undefined reference to `glTexParameteriv'
../../lib/libosg.so.3.6.3: undefined reference to `glFrontFace'
../../lib/libosg.so.3.6.3: undefined reference to `glLightModeli'
../../lib/libosg.so.3.6.3: undefined reference to `glGetBooleanv'

Is this the proper way to compile OSG with EGL support?

I am using CMake 3.13, Ubuntu 18.04 and gcc 7.3. I think the way in which
the OpenGL library is linked has to change (to support the new CMake flag
OpenGL_GL_PREFERENCE).


Re: [osg-users] getWindowingSystemInterface() fails on Ubuntu 18.04

2018-09-05 Thread Omar Álvarez
Sorry, I meant the samples that use it. You're right.

On Wed, Sep 5, 2018, 18:37 Robert Osfield  wrote:

> Hi Omar,
>
> On Wed, 5 Sep 2018 at 17:07, Omar Álvarez  wrote:
> > Thank you very much for your time. If nobody else feels like doing it, I
> can help with updating osg::GraphicsContext::WindowingSystemInterface.
>
> This hasn't looked like an OSG bug to me, just a usage issue.  What
> specifically do you think might need changing with
> WindowingSystemInterface?
>
> Robert.


Re: [osg-users] getWindowingSystemInterface() fails on Ubuntu 18.04

2018-09-05 Thread Omar Álvarez
Hi Robert,

Your answer almost fixed my issue. There is just one problem:
wsi->getScreenSettings() still queries the wrong session:

1 screen(s) detected
Invalid MIT-MAGIC-COOKIE-1 key
Unable to open display ":0.1".
  Screen #0 : 0x0 0Hz 0 bit

The problem is screen_id(screen): constructed that way, the identifier
assumes display 0, but my session is not on display 0. The proper call
would be:

osg::GraphicsContext::ScreenIdentifier screen_id(
    main_screen_id.hostName, main_screen_id.displayNum, screen);

I will update the test repo with the proper code.
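
Put together, a minimal sketch of the corrected query loop (wsi as in my
test program; readDISPLAY() and setUndefinedScreenDetailsToDefaultScreen()
are the helpers Robert quotes below):

// Build the main screen identifier from $DISPLAY instead of assuming :0.0.
osg::GraphicsContext::ScreenIdentifier main_screen_id;
main_screen_id.readDISPLAY();                              // parse $DISPLAY
main_screen_id.setUndefinedScreenDetailsToDefaultScreen(); // fill in -1 fields

for ( unsigned int screen = 0; screen < wsi->getNumScreens(main_screen_id); screen++ )
{
    osg::GraphicsContext::ScreenIdentifier screen_id(
        main_screen_id.hostName, main_screen_id.displayNum, screen);
    osg::GraphicsContext::ScreenSettings screen_settings;
    wsi->getScreenSettings( screen_id, screen_settings );
}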

Thank you very much for your time. If nobody else feels like doing it, I
can help with updating osg::GraphicsContext::WindowingSystemInterface.

Cheers,

Omar.

2018-09-05 16:50 GMT+02:00 Robert Osfield :

> On Wed, 5 Sep 2018 at 15:22, Julien Valentin 
> wrote:
> > for your code you should replace
> > wsi->getNumScreens()
> > with
> > wsi->getNumScreens(osg::GraphicsContext::ScreenIdentifier(1))
> > to work on DISPLAY=:1.0
>
> Sounds like we are getting to the bottom of things now :-)
>
> FYI, WindowingSystemInterface::getNumScreens() is implemented in
> include/osg/GraphicsContext as:
>
> virtual unsigned int getNumScreens(const ScreenIdentifier&
> screenIdentifier = ScreenIdentifier()) = 0;
>
> The default constructed ScreenIdentifier is:
>
> GraphicsContext::ScreenIdentifier::ScreenIdentifier():
> displayNum(0),
> screenNum(0) {}
>
> Which is fine if the system doesn't change the default DISPLAY from 0.0.
>
> Support for DISPLAY is actually built into ScreenIdentifier via the
> readDISPLAY() method:
>
> /** Read the DISPLAY environmental variable, and set the
>   * ScreenIdentifier accordingly. Note, if either of displayNum or
>   * screenNum are not defined then -1 is set respectively to signify
>   * that this parameter has not been set. When parameters are undefined
>   * one can call setUndefinedScreenDetailsToDefaultScreen() after
>   * readDISPLAY() to ensure valid values. */
> void readDISPLAY();
>
> It is not called by the constructor though, so you need to call it
> explicitly.  The various Viewer config implementations do actually
> call readDISPLAY:
>
> ~/OpenSceneGraph/src/osgViewer$ grep readDISPLAY */*.cpp
> config/AcrossAllScreens.cpp:si.readDISPLAY();
> config/PanoramicSphericalDisplay.cpp:si.readDISPLAY();
> config/SingleWindow.cpp:traits->readDISPLAY();
> config/SingleWindow.cpp:si.readDISPLAY();
> config/SphericalDisplay.cpp:si.readDISPLAY();
> config/WoWVxDisplay.cpp:si.readDISPLAY();
>
> So I'd suggest using this, such as (modified main.cpp for osgtest):
>
>     std::cout << wsi->getNumScreens() << " screen(s) detected" << std::endl;
>     for ( unsigned int screen=0 ; screen < wsi->getNumScreens(main_screen_id); screen++ )
>     {
>         osg::GraphicsContext::ScreenIdentifier screen_id(screen);
>         osg::GraphicsContext::ScreenSettings screen_settings;
>         wsi->getScreenSettings( screen_id, screen_settings );
>         std::cout << "  Screen #" << screen << " : "
>                   << screen_settings.width << "x" << screen_settings.height << " "
>                   << screen_settings.refreshRate << "Hz "
>                   << screen_settings.colorDepth << " bit" << std::endl;
>     }
>
> I have also attached the full modified file.
>
> Robert.
>


Re: [osg-users] getWindowingSystemInterface() fails on Ubuntu 18.04

2018-09-05 Thread Omar Álvarez
Hi Robert,

Thanks for the quick response.

I have created an example:

https://github.com/omaralvarez/osgtest

osgviewer runs fine, no errors in console. But if I use INFO, I can see:

Viewer::realize() - No valid contexts found, setting up view across all
screens.

It may be related, but I'm not sure.

None of the examples I've tried have failed.

Cheers,

Omar.

2018-09-05 13:19 GMT+02:00 Robert Osfield :

> Hi Omar,
>
> What happens when you run osgviewer?  Do you get errors output to the
> console?
>
> Do any of the OSG examples fail?
>
> If we can't see the error in standard OSG examples then it may be
> worth creating a small test program so that others can run it on their
> own systems to see if we can establish a pattern and get to the bottom
> of the issue.
>
> Robert.


[osg-users] getWindowingSystemInterface() fails on Ubuntu 18.04

2018-09-05 Thread Omar Álvarez
osg::GraphicsContext::WindowingSystemInterface* wsi =
    osg::GraphicsContext::getWindowingSystemInterface();
if ( !wsi ) {
    std::cout << "ERROR. Could not access the Windowing System Interface"
              << std::endl;
    throw -1;
}

std::cout << wsi->getNumScreens() << " screen(s) detected" << std::endl;
for ( unsigned int screen=0 ; screen < wsi->getNumScreens(); screen++ )
{
    osg::GraphicsContext::ScreenIdentifier screen_id(screen);
    osg::GraphicsContext::ScreenSettings screen_settings;
    wsi->getScreenSettings( screen_id, screen_settings );
    std::cout << "  Screen #" << screen << " : "
              << screen_settings.width << "x" << screen_settings.height << " "
              << screen_settings.refreshRate << "Hz "
              << screen_settings.colorDepth << " bit" << std::endl;
}

This simple code snippet fails for me on Ubuntu 18.04 (OSG 3.6.2 and 3.4.0)
with:

Invalid MIT-MAGIC-COOKIE-1 key
Unable to open display ":0.0"
Invalid MIT-MAGIC-COOKIE-1 key
Unable to open display ":0.0"
0 screen(s) detected
Invalid MIT-MAGIC-COOKIE-1 key
Unable to open display ":0.0"

I have a dedicated NVIDIA GPU with the latest drivers (396), and running
osgviewer works properly. I also checked that getWindowingSystemInterface
is being called at the correct time; I have tested this after creating a
viewer and it also fails.

RegisterWindowingSystemInterfaceProxy()
X11WindowingSystemInterface()
GraphicsContext::setWindowingSystemInterface() 0x55aada1ee9d0
0x7f69e2fc9978
GraphicsContext::getWindowingSystemInterface() 0x55aada1ee9d0
0x7f69e2fc9978

Another thing that makes no sense is that my X session is not :0.0 but
:1.*. No matter what the $DISPLAY environment variable contains, it is
ignored, and screens are looked up in the wrong X session. At this point
I don't know what else to try. Am I doing something wrong? Any ideas?