Sorry, in my desire to shorten the email, I appear to have skipped
something important. Here are the snippets of my X log that led me
to believe I'm in 32bpp:
(II) Setting vga for screen 0.
(==) ATI(0): Chipset: "ati".
(**) ATI(0): Depth 24, (--) framebuffer bpp 32
...
(==) ATI(0): RGB weight 888
(==) ATI(0): Default visual is TrueColor
...
(--) Depth 24 pixmap format is 32 bpp
Of course, as someone else has pointed out, why does Mesa necessarily
have to use the same pixel format internally as X does? Is that to
avoid having to do a format conversion every frame? Or did that
requirement come in as part of the merge with the X code base?
-'f
On Tue, Apr 23, 2002 at 11:15:02AM -0700, Mark Vojkovich wrote:
> On Mon, 22 Apr 2002, Geoffrey Broadwell wrote:
> > [snip]
> > Anyone have any clue what's going on? As far as I can interpret the tea
> > leaves, it looks like I'm really running with 32-bit pixels, so I don't
> > see a problem there . . . .
>
> I see only xdpyinfo and glxinfo attached. There's nothing in those
> to specify whether or not you are in 24bpp or 32bpp. Why do you think
> you are in 32bpp?
>
>
> Mark.
_______________________________________________
Xpert mailing list
[EMAIL PROTECTED]
http://XFree86.Org/mailman/listinfo/xpert