For an important but infrequently used program based on FLTK, I have suddenly
found that it is able to obtain only 5-bit color buffers (5+5+5 instead of
8+8+8), causing my application to misbehave.  (I read back the frame buffer,
so the bit depth is important to the proper function of the program.)

I don't believe I rebuilt the application recently, but I did upgrade my Macs
from 10.6 (where I know the program worked) to 10.6.4 (in steps, of course,
though I didn't check the program at each step), and now it no longer works.
I did not recompile the program during this progression of upgrades, and if I
recompile now, the problem persists.  Any suggestions?

For a bit more information, I am currently using Mac 10.6.4, and fink's package
for FLTK 1.1.7.  The program is compiled for 32-bit.  My graphics card is an
NVIDIA 8800GT.

To try getting to the bottom of the problem, I tried invoking
fl_window->mode(FL_RGB8) explicitly, although I shouldn't have to, and I
checked fl_window->can_do(), which returns true.  However, when I then ask
OpenGL for the depth of the returned window using
glGetIntegerv(GL_RED_BITS, &redbits), it reports 5 bits.  It should report 8.
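In case it helps anyone reproduce this, here is roughly the test boiled down
to a standalone sketch (the class name, window size, and printf reporting are
just for illustration, not my actual program):

```cpp
// Minimal repro sketch for the 5-bit vs. 8-bit color buffer question.
// Assumes FLTK 1.1.x built with OpenGL support; everything here besides
// mode(), can_do(), and glGetIntegerv() is illustrative scaffolding.
#include <FL/Fl.H>
#include <FL/Fl_Gl_Window.H>
#include <FL/gl.h>
#include <cstdio>

class TestWindow : public Fl_Gl_Window {
public:
    TestWindow(int w, int h) : Fl_Gl_Window(w, h) {
        mode(FL_RGB8 | FL_DOUBLE);          // request 8 bits per channel
    }
    void draw() {
        static bool reported = false;
        if (!reported) {                    // query once the context is current
            GLint redbits = 0;
            glGetIntegerv(GL_RED_BITS, &redbits);
            std::printf("GL_RED_BITS = %d\n", (int)redbits);  // expect 8, see 5
            reported = true;
        }
        glClear(GL_COLOR_BUFFER_BIT);
    }
};

int main(int argc, char **argv) {
    TestWindow win(256, 256);
    if (!win.can_do()) {                    // is the requested mode available?
        std::printf("requested GL mode not supported\n");
        return 1;
    }
    win.show(argc, argv);
    return Fl::run();
}
```

Note that glGetIntegerv() is only meaningful after the GL context is current,
which is why the query lives in draw() rather than in main().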

Remember that this code has been working for years, and I believe the bug
cropped up without my recompiling the program.  Thus, I suspect it has arisen
from one of the Mac upgrades, possibly in GLX, or X, or Quartz, or in the
NVIDIA driver.  Any suggestions?

-Marc Levoy, Stanford University

_______________________________________________
fltk-bugs mailing list
[email protected]
http://lists.easysw.com/mailman/listinfo/fltk-bugs
