Around 9 o'clock on Sep 15, Sidik Isani wrote:

|>   With 8 bpp pseudocolor, I'm finding that simply starting the
|>   X-Server allocates almost all the color cells in the default
|>   colormap.  There are only about 12 free cells, and most programs
|>   either fail or install a private colormap.  Is there an extension
|>   which may be doing this?

Yes, the Render extension allocates most of your colormap now.  I was 
planning to make the number it consumes configurable, but I got 
sidetracked by other work temporarily.  If you build your own X server, 
you can easily hack this number in the source; look at

        programs/Xserver/render/miindex.c:miInitIndexed

What do others think we should do about this?  One possibility would be to 
bend the protocol and match incoming allocation requests to this palette 
when the colormap is otherwise full; that would eliminate color flashing 
while still providing reasonable color matching.  The protocol is rather 
vague about when this kind of sharing may occur; perhaps some less 
stringent metric could be applied, allowing more distant colors to be used.

The other option is to use StaticColor as your default visual; read-only 
allocations will always succeed.  Is there some reason you can't run your 
display at 16 or 24 bits?

[EMAIL PROTECTED]        XFree86 Core Team              SuSE, Inc.


_______________________________________________
Xpert mailing list
[EMAIL PROTECTED]
http://XFree86.Org/mailman/listinfo/xpert
