Hi,

On 02/09/2011 01:19, Frank Sullivan wrote:
Thanks guys, that is very helpful.

I am having a bit of trouble with this, and I was wondering if you might be 
able to help. For some reason, my call to glReadPixels seems to be a no-op: 
none of the memory my data pointer points to is actually overwritten by the 
call.

I ran the osgscreencapture example using --pbuffer-only and --no-pbo, and that 
seems to work fine. The debugger shows that, inside osg::Image::readPixels, 
the image data is indeed written to by the call to glReadPixels. So something 
is definitely different about my program, although I can't spot any 
appreciable difference between how I set mine up and how osgscreencapture is 
set up.

Here is where I set up my viewer:


Code:
_viewer = new osgViewer::Viewer;

// Init the GraphicsContext Traits
osg::ref_ptr<osg::GraphicsContext::Traits> traits = new osg::GraphicsContext::Traits;
traits->x = 0;
traits->y = 0;
traits->width = OSGDLL_FRAME_WIDTH;
traits->height = OSGDLL_FRAME_HEIGHT;
traits->red = 8;
traits->green = 8;
traits->blue = 8;
traits->alpha = 8;
traits->windowDecoration = false;
traits->pbuffer = true;
traits->doubleBuffer = false;
traits->sharedContext = 0;

// Create the GraphicsContext
osg::ref_ptr<osg::GraphicsContext> gc = osg::GraphicsContext::createGraphicsContext(traits.get());

// Create & Setup Camera
osg::ref_ptr<osg::Camera> camera = _viewer->getCamera();
camera->setGraphicsContext(gc.get());
camera->setClearColor(osg::Vec4(1.0f, 0.0f, 0.0f, 1.0f));
camera->setViewport(new osg::Viewport(0, 0, OSGDLL_FRAME_WIDTH, OSGDLL_FRAME_HEIGHT));

// Add some scene data
osg::ref_ptr<osg::Node> shed = osgDB::readNodeFile("Shed2.flt");
_viewer->setSceneData(shed.get());

// Realize viewer
_viewer->realize();



The main difference here is that I'm not creating a new camera and setting it 
as a slave. I'm just using the main camera that was provided by the viewer. 
Also, I am not double-buffering (I'll explain why in a sec). Here is where I 
actually draw a frame and then try to retrieve it:


Code:
unsigned char imgData[OSGDLL_FRAME_WIDTH*OSGDLL_FRAME_HEIGHT*4];
memset(imgData, 0x00, OSGDLL_FRAME_WIDTH*OSGDLL_FRAME_HEIGHT*4);

// Render one frame, then try to read it back from the front buffer.
_viewer->frame();
glReadBuffer(GL_FRONT);
GLenum readBufferError = glGetError();
glReadPixels(0, 0, OSGDLL_FRAME_WIDTH, OSGDLL_FRAME_HEIGHT, GL_BGRA,
             GL_UNSIGNED_BYTE, imgData);
GLenum readPixelsError = glGetError();



As you can see, I am zeroing out the memory before I call glReadPixels. After I 
make the call, the memory is still zeroed out. I've tried other values as well. 
For instance, if I use 0x0F as my memset value, the array elements remain as 
0x0F even after glReadPixels is called. So, it's not even that the frame buffer 
is empty and I'm getting all zeros or something. Rather, it's just failing to 
overwrite my data with whatever is in the frame buffer.

And yet, I don't get any errors. Both calls to glGetError in the above code 
return zero. Now, if I turn double buffering on and pass GL_BACK to 
glReadBuffer, the call returns GL_INVALID_OPERATION for reasons I don't 
understand: the back buffer should exist, to my understanding, and I'm not 
making the call between glBegin/glEnd, which are the only causes the 
glReadBuffer man page lists for GL_INVALID_OPERATION. So I just turned double 
buffering off and passed GL_FRONT instead. That got rid of the 
GL_INVALID_OPERATION error, but unfortunately I'm still not getting anything 
from glReadPixels.
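
(One thing I haven't verified: whether the pbuffer context is even current on 
my thread after _viewer->frame() returns. If it isn't, would I need to make it 
current myself, something like this untested sketch, where gc is the 
GraphicsContext I created above?)


Code:
// Untested idea: explicitly make the pbuffer context current on this
// thread before reading back, then release it afterwards.
if (gc->makeCurrent())
{
    glReadBuffer(GL_FRONT);
    glReadPixels(0, 0, OSGDLL_FRAME_WIDTH, OSGDLL_FRAME_HEIGHT, GL_BGRA,
                 GL_UNSIGNED_BYTE, imgData);
    gc->releaseContext();
}
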

As another test, I tried using osg::Image::readPixels, just in case it was 
doing something important that I am failing to do:


Code:
osg::ref_ptr<osg::Image> img = new osg::Image;
img->readPixels(0, 0, 256, 256, GL_BGRA, GL_UNSIGNED_BYTE);



The end result was the same: the image allocated itself a data buffer, but 
glReadPixels failed to fill it in. The entire buffer is just full of 0xCD 
bytes (the MSVC debug-heap fill pattern for uninitialized memory).

Although I am not creating any PBOs to my knowledge, I tried binding 
GL_PIXEL_PACK_BUFFER to 0 just in case. However, this did not seem to help.

So, I'm thinking that there is something else about the OpenGL state that 
needs to be set for this to work, and that I am missing it. Does anyone, by 
chance, have any ideas?

The context you are reading from must be current on the thread that makes the GL calls; if it isn't, the calls typically do nothing and glGetError reports nothing, which matches what you're seeing. Try sticking your readback code into a camera post draw callback, which runs on the draw thread with the context current.
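
Something along these lines should do it (a rough, untested sketch; 
ReadbackCallback is just an illustrative name, not OSG API):


Code:
#include <osg/Camera>
#include <osg/Image>

// Runs on the draw thread after the camera has drawn, with the camera's
// GraphicsContext current -- which is exactly what glReadPixels needs.
struct ReadbackCallback : public osg::Camera::DrawCallback
{
    ReadbackCallback() : image(new osg::Image) {}

    virtual void operator()(osg::RenderInfo&) const
    {
        // The context is current here, so the read actually fills the image.
        image->readPixels(0, 0, OSGDLL_FRAME_WIDTH, OSGDLL_FRAME_HEIGHT,
                          GL_BGRA, GL_UNSIGNED_BYTE);
    }

    osg::ref_ptr<osg::Image> image;
};

// Install it on the viewer's camera before calling frame():
//   camera->setPostDrawCallback(new ReadbackCallback);
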

jp



------------------
Read this topic online here:
http://forum.openscenegraph.org/viewtopic.php?p=42371#42371






_______________________________________________
osg-users mailing list
osg-users@lists.openscenegraph.org
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
