It sounds like you're trying to synthesize a texture by blending two paletted textures in memory before you render. Neat idea, but I'm not sure anyone here (Valve) can help you, because Half-Life doesn't do that anywhere. Palettes aren't used during blending; they're only used for the on-disk (and sometimes in-memory) format. Blending in Half-Life is always done in screen format, which is 15-, 16-, or 24-bit color depending on your hardware or software mode. At that stage, it's never converted back to paletted mode.
Since HL doesn't do anything like what you're trying to do, I'm not sure anyone here has thought through all the issues. If you're calling through tri-api, the first problem you may be hitting is that HL optimizes the palette list: all textures that share the same palette get indexed to the same GL palette entry. So your new palette may not actually be getting loaded, or it could be any one of many other possibilities. You may want to ask Alfred Reynolds directly; I think he's done some work with loading textures post-initialization.

-----Original Message-----
From: Skyler York [mailto:[EMAIL PROTECTED]
Sent: Monday, March 03, 2003 3:55 PM
To: [EMAIL PROTECTED]
Subject: [hlcoders] Additive Rendermode

Please excuse my last e-mail, Hotmail is a piece of sh*t... >:(

Anyway, I am working on a few features/additions for the compile tools and wanted to use Half-Life's additive rendering to blend two textures together. Not literally, of course, but simply to use the same equation Half-Life uses to blend two textures together additively. Now, before you say "STFW!!!" I already have, and to my surprise I couldn't find much info. I did happen to read that additive blending was like glBlendFunc(GL_ONE, GL_ONE), but I couldn't decipher the MSDN docs to figure out what the actual equation would be :)

One problem is that the texture miptexes in the WAD files use 8-bit palettes. So while I can do the blending in 24-bit, I have to regenerate a new palette based on the end blend. Luckily, I am able to generate pretty good palettes with NeuQuant, an algorithm that uses Kohonen's self-organizing neural networks. However, it isn't perfect. The problem I'm having is that whenever I test a new blending algorithm in an attempt to emulate Half-Life's, it's difficult to tell whether the error is in the algorithm or in the generated palette.
I know a lot of Valve programmers hang out here, so if anyone could point me in the right direction on Half-Life's additive blending, that would be much appreciated. I'm pretty sure it's an OpenGL/D3D thing, but I couldn't decipher the docs and the web hasn't been helpful. :) And if anyone knows of a good palette-generation algorithm, I'd like to hear about it :) Neural networks are supposed to be pretty dang good, but the code I'm using is from 1994, so who knows :P

_______________________________________________
To unsubscribe, edit your list preferences, or view the list archives, please visit:
http://list.valvesoftware.com/mailman/listinfo/hlcoders

