I'm doing my 16-bit RTT work with GL_RGBA16 successfully now, though
osg::Image doesn't like it much (I worked around the problem). However, I
recently made some change that knocked me off the fast path onto software
rendering (!) on some older hardware. My GTX 560M is just fine, but a Quadro
FX 2700M laptop is falling back to software rendering (Ayieeee!), and a
9600M GT is reported to do the same.

  Now, the Quadro is supposed to be a G94GLM core:
http://en.wikipedia.org/wiki/Nvidia_Quadro#Quadro_FX_M

  and this chart:
http://developer.download.nvidia.com/opengl/texture_formats/nv_ogl_texture_formats.pdf

  seems to indicate that the G80, GT200 and presumably later support RGBA16
without compromise.

  Any other suggestions about what could be kicking me off the fast path on
these devices? Could it be use of some particular GLSL function, or of some
texture mode? I wouldn't think allocating too many 16-bit textures would do
it, because I'd expect that to simply fail to allocate the last one(s)
rather than silently fall back.

  Any pointers to where I could learn what might be kicking me off the
hardware path are appreciated.


  OSG has reported "Rendering in software: pixelFormatIndex 3." on one of
the devices, a GeForce 7300. I'm not clear on what GPU core this is (G73?)
or where it falls on that chart, but empirical testing of earlier versions
of the code seemed to indicate it was delivering 16-bit capability fine.


-- 
Chris 'Xenon' Hanson, omo sanza lettere. [email protected]
http://www.alphapixel.com/
Training • Consulting • Contracting
3D • Scene Graphs (Open Scene Graph/OSG) • OpenGL 2 • OpenGL 3 • OpenGL 4 •
GLSL • OpenGL ES 1 • OpenGL ES 2 • OpenCL
Digital Imaging • GIS • GPS • Telemetry • Cryptography • Digital Audio •
LIDAR • Kinect • Embedded • Mobile • iPhone/iPad/iOS • Android
_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
