My OSG app renders to two screens (:0.0 and :0.1) on a Linux system. If I use an OcclusionQueryNode as my scene root, it crashes in this configuration.
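
For reference, here is a stripped-down sketch of how I set things up. It is not my real code; "cow.osg", the window sizes, and the helper name createContext() are placeholders, but the structure (two slave cameras, one per screen, sharing a single OcclusionQueryNode root) is the same:

    #include <osg/Camera>
    #include <osg/OcclusionQueryNode>
    #include <osgDB/ReadFile>
    #include <osgViewer/Viewer>

    // One GraphicsContext per X screen (:0.0 and :0.1).
    osg::GraphicsContext* createContext(int screenNum)
    {
        osg::ref_ptr<osg::GraphicsContext::Traits> traits =
            new osg::GraphicsContext::Traits;
        traits->displayNum = 0;
        traits->screenNum  = screenNum;      // 0 or 1
        traits->x = 0;  traits->y = 0;
        traits->width = 1280;  traits->height = 1024;  // placeholder size
        traits->doubleBuffer = true;
        return osg::GraphicsContext::createGraphicsContext(traits.get());
    }

    int main()
    {
        // OcclusionQueryNode as the root, shared by both contexts.
        osg::ref_ptr<osg::OcclusionQueryNode> root =
            new osg::OcclusionQueryNode;
        osg::ref_ptr<osg::Node> model = osgDB::readNodeFile("cow.osg");
        if (model.valid()) root->addChild(model.get());

        osgViewer::Viewer viewer;
        for (int screen = 0; screen < 2; ++screen)
        {
            osg::ref_ptr<osg::Camera> camera = new osg::Camera;
            camera->setGraphicsContext(createContext(screen));
            camera->setViewport(0, 0, 1280, 1024);
            viewer.addSlave(camera.get());   // shares the master scene
        }
        viewer.setSceneData(root.get());
        return viewer.run();   // GL errors appear, then the crash below
    }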

First I get these messages:

Warning: detected OpenGL error 'invalid operation' after RenderBin::draw(,)
Warning: detected OpenGL error 'invalid operation'

Then a series of these:

Warning: detected OpenGL error 'invalid operation' at start of State::apply()

Then a crash:

*** glibc detected *** ./app.bin: malloc(): memory corruption (fast): 0x00002aaab0190e66 ***
======= Backtrace: =========
/lib64/libc.so.6[0x3a0f8725e0]
/lib64/libc.so.6(__libc_malloc+0x7d)[0x3a0f872e8d]
/usr/lib64/libGLcore.so.1[0x3488fd8114]

I assume that I need separate query objects per GL context to make this work, but I am looking for advice on how to implement that. A sketch of what I have in mind is below.
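
To illustrate what I mean by "separate instances": instead of one OcclusionQueryNode shared by both contexts, each slave camera would get its own OcclusionQueryNode wrapping the shared model. This is untested and may not be the right approach; it reuses the placeholder createContext() helper from the sketch above:

    osg::ref_ptr<osg::Node> model = osgDB::readNodeFile("cow.osg");

    for (int screen = 0; screen < 2; ++screen)
    {
        osg::ref_ptr<osg::Camera> camera = new osg::Camera;
        camera->setGraphicsContext(createContext(screen));
        camera->setViewport(0, 0, 1280, 1024);

        // Per-context OQN: each context would issue occlusion queries
        // against its own node rather than one shared across contexts.
        osg::ref_ptr<osg::OcclusionQueryNode> oqn =
            new osg::OcclusionQueryNode;
        oqn->addChild(model.get());
        camera->addChild(oqn.get());

        // 'false' = do not attach the master scene data to this slave,
        // so each slave renders only its own OQN subgraph.
        viewer.addSlave(camera.get(), false);
    }

Is something along these lines the intended usage, or does OcclusionQueryNode handle per-context query objects internally and this points to a bug elsewhere? (As a diagnostic I can also try viewer.setThreadingModel(osgViewer::Viewer::SingleThreaded) to rule out threading.)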

Thanks,
Todd

--
Todd J. Furlong
Inv3rsion, LLC
http://www.inv3rsion.com