Hi Curtis,

I have gone through the DDS plugin, checking the mipmap_offsets vector that gets passed to the osg::Image to set where the mipmap levels are, and it all looks correct. I've put debugging into Texture.cpp where the data gets downloaded to the GPU, and again all the values are as they should be.
I have also modified the mipmap_offsets to produce invalid levels, and these errors show up onscreen exactly as I would expect them to. This confirms that the values are being used, and used correctly. I have also looked for OSX-specific code paths in Texture.cpp; the only relevant one is the ClientStorageHint-related code, but as far as I can tell this hint shouldn't be enabled in this case, so it looks to me like exactly the same code paths are used under Linux and OSX, with exactly the same data.

The OSG code that I've looked at looks fine, the data looks fine, and it all seems to work as intended. This leaves me without anything more that I can do at my end. I'll have to pass this on to OSX devs to get to the bottom of why precomputed compressed mipmaps don't work on OSX; they should, as they are a pretty well established part of the spec. I can't rule out an OSG bug somewhere along the line, but at this point I have to say the most likely cause of the problem is the OSX OpenGL drivers. I would recommend looking into support forums to see if others have had problems with precomputed compressed mipmaps.

As for workarounds for buggy drivers, perhaps the best thing to do in this instance would be to look at disabling extensions, or having an OSX-specific hack in your own application code that disables the mipmaps on the osg::Image, so your OSX application doesn't attempt to pass precomputed mipmaps.

Robert.

_______________________________________________
osg-users mailing list
[email protected]
http://lists.openscenegraph.org/listinfo.cgi/osg-users-openscenegraph.org
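The application-side workaround suggested above could be sketched as follows. This is a hypothetical illustration of the pattern only: in real OSG code the mipmap data would likely be cleared via something like image->setMipmapLevels(osg::Image::MipmapDataType()) inside an #ifdef __APPLE__ guard, but the function below keeps the platform choice as a parameter so it stays self-contained; the names are illustrative, not verified OSG API.

```cpp
#include <vector>

// Hypothetical sketch: on Apple platforms, drop the precomputed mipmap
// offsets so that only the base level is handed to the GL, sidestepping
// the suspected driver bug with precomputed compressed mipmaps.
std::vector<unsigned int> mipmapOffsetsForPlatform(
    std::vector<unsigned int> offsets, bool isApplePlatform)
{
    if (isApplePlatform)
        offsets.clear(); // drop levels 1..n; base level only
    return offsets;
}
```

On non-Apple platforms the offsets pass through untouched, so the same code path still uploads the precomputed levels where the drivers handle them correctly.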

