Greetings,

I'm currently trying to migrate an old in-house app (which we no
longer have the source for) to a new Linux box. Previously this app
ran on a RH6.2 box, and I've noticed some strange behaviour in how it
uses Mesa, specifically the glXUseXFont() function. I was seeing some
font rendering issues; it looked like characters were getting
cropped.

I've tracked down the problem - for some reason this app has its own
implementation of Fake_glXUseXFont(). When the API function
glXUseXFont() (in Mesa's libGL) is called, it invokes the function
which does the real work (Fake_glXUseXFont) via a function table, and
it appears that this reference ends up getting resolved to the app's
own internal Fake_glXUseXFont() symbol, instead of the one inside
libGL (Mesa).
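
For what it's worth, here's a minimal sketch of the interposition I
believe is happening. The file names, gcc commands and function
bodies below are mine, purely for illustration - they're not Mesa's
actual code:

    /* libgl_stub.c - stands in for libGL.  Build with:
     *   gcc -shared -fPIC -o libgl_stub.so libgl_stub.c
     */
    #include <stdio.h>

    /* Internal worker with default ELF visibility, like Fake_*. */
    void Fake_glXUseXFont(void)
    {
        printf("library's Fake_glXUseXFont\n");
    }

    /* The "function table": the public entry point calls the worker
     * through this pointer.  Its relocation is resolved by the
     * dynamic linker in global scope - executable first. */
    static void (*dispatch)(void) = Fake_glXUseXFont;

    void glXUseXFont(void)
    {
        dispatch();
    }

    /* app.c - plays the role of the old in-house app.  Build with:
     *   gcc -o app app.c -L. -lgl_stub -Wl,-rpath,.
     */
    #include <stdio.h>

    void glXUseXFont(void);

    /* The app's own copy of the worker.  Because the executable's
     * symbols are searched before the library's, the library's
     * dispatch pointer binds here instead of to its own copy. */
    void Fake_glXUseXFont(void)
    {
        printf("app's Fake_glXUseXFont\n");
    }

    int main(void)
    {
        glXUseXFont();   /* prints "app's Fake_glXUseXFont" */
        return 0;
    }

Running ./app prints the app's message, which is exactly the capture
I'm seeing with the real libGL.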

I've worked around this for now by rebuilding Mesa with the Fake_
prefix changed to MesaImpl_, which stops libGL's function table from
finding the app's symbol, but if possible I'd like to solve this in a
way which doesn't involve me maintaining patches on top of Mesa
forever!
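
Two less invasive approaches I'm aware of (assuming GCC and GNU ld -
these are my own suggestions, not anything Mesa currently does):
relinking libGL with -Wl,-Bsymbolic binds the library's internal
references to its own definitions with no source change at all;
alternatively, a hidden alias keeps the upstream symbol name while
making the dispatch-table binding non-interposable. A sketch of the
latter (Fake_glXUseXFont_internal is a name I made up):

    /* The real worker, unchanged. */
    void Fake_glXUseXFont(void)
    {
        /* ... real font-rendering work ... */
    }

    /* Hidden alias: same code, but invisible outside the library,
     * so it can never be preempted by an application symbol. */
    extern __typeof(Fake_glXUseXFont) Fake_glXUseXFont_internal
        __attribute__((alias("Fake_glXUseXFont"),
                       visibility("hidden")));

    /* Point the function table at the alias instead. */
    static void (*dispatch)(void) = Fake_glXUseXFont_internal;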

I'd like to understand the rationale for these Fake_* functions - is
what I've encountered by design (i.e. to allow apps to override the
default Mesa implementation), or is this behaviour purely accidental,
and should these Fake_* symbols have more limited visibility (i.e.
hidden / internal) so they can never be overridden?
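
If the latter, I imagine "hidden by default" would look something
like the sketch below - this assumes GCC's -fvisibility=hidden
option, and again isn't Mesa's actual build setup:

    /* Build the library with -fvisibility=hidden, then re-export
     * only the public GLX entry points. */
    #define PUBLIC __attribute__((visibility("default")))

    /* Stays out of the dynamic symbol table entirely, so an app's
     * symbol of the same name can never capture it. */
    void Fake_glXUseXFont(void)
    {
        /* ... */
    }

    /* Still exported for applications to call. */
    PUBLIC void glXUseXFont(void)
    {
        Fake_glXUseXFont();
    }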

(Note this is all pure software rendering; hardware acceleration is
the next step!)

Any advice / info would be most appreciated.


Thanks


Dave Rigby



