OK, it looks like I've got 4 bits each of RGBA, though I'm not sure whether this is a 16-bit format or a 24-bit format with some wasted space. So the question is: how would I go about setting the FBO format to plain RGB (no alpha), in the hope that it really is a 24-bit buffer and I can get all 8 bits per color channel?
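For what it's worth, here's a minimal sketch of how I'd try requesting an explicitly sized RGB color renderbuffer through the EXT_framebuffer_object entry points. This assumes a current GL context with the extension loaded (e.g. via GLEW); the function name and the lack of error handling are just for illustration, and the driver is still free to substitute a different internal format:

```c
#include <GL/glew.h>

/* Create a color renderbuffer, asking for the sized GL_RGB8 format rather
 * than the unsized GL_RGB, so the driver is less likely to hand back a
 * 4-bit-per-channel format. */
GLuint makeRGB8ColorBuffer(int width, int height)
{
    GLuint rb;
    glGenRenderbuffersEXT(1, &rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, rb);
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGB8, width, height);
    return rb;
}
```

After attaching it with glFramebufferRenderbufferEXT() to GL_COLOR_ATTACHMENT0_EXT, glCheckFramebufferStatusEXT() should be checked for GL_FRAMEBUFFER_COMPLETE_EXT before trusting the result.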
Also, I found a thread that makes the (unreferenced) claim that ATI only supports 16 bits in their FBOs, and that nVidia only supports the same format as the window framebuffer: http://www.gamedev.net/community/forums/topic.asp?topic_id=442743

I will try rebooting into OS X and see what's there...

> I'm playing now with
> "glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT,
> GL_RENDERBUFFER_INTERNAL_FORMAT_EXT, &params);" to see if any
> information can be gained...

_______________________________________________
osg-users mailing list
[email protected]
http://openscenegraph.net/mailman/listinfo/osg-users
http://www.openscenegraph.org/
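The query in the quoted snippet can be extended to read back the actual per-channel bit depths the driver allocated, which should settle the 16-bit vs. 24-bit question directly. A sketch, assuming the renderbuffer in question is currently bound and EXT_framebuffer_object is loaded:

```c
#include <GL/glew.h>
#include <stdio.h>

/* Print the internal format and per-channel bit sizes of the currently
 * bound renderbuffer. */
void printRenderbufferFormat(void)
{
    GLint fmt, r, g, b, a;
    glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT,
                                    GL_RENDERBUFFER_INTERNAL_FORMAT_EXT, &fmt);
    glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT,
                                    GL_RENDERBUFFER_RED_SIZE_EXT, &r);
    glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT,
                                    GL_RENDERBUFFER_GREEN_SIZE_EXT, &g);
    glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT,
                                    GL_RENDERBUFFER_BLUE_SIZE_EXT, &b);
    glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT,
                                    GL_RENDERBUFFER_ALPHA_SIZE_EXT, &a);
    printf("internal format 0x%x, RGBA bits: %d/%d/%d/%d\n", fmt, r, g, b, a);
}
```

If this reports 4/4/4/4 the driver really did give a 16-bit buffer; 8/8/8/0 would confirm the hoped-for 24-bit RGB layout.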
