On Fri, Jun 07, 2002 at 09:42:22AM -0400, Al Tobey wrote:

> Here's the idea: would it be beneficial to convert a texture to a
> low-quality jpeg, then back again to take advantage of some of the
> inherent lossiness?  It, in theory, should reduce the size of the
> texture without affecting the bit depth and shouldn't require a ton of
> code to get working (use libjpeg).  I don't foresee a big problem
> performance-wise if the textures are mangled during the loading stage,
> but I'm still learning ...
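[The lossy round-trip being proposed above can be sketched without pulling in
libjpeg itself.  The following is purely illustrative and every name in it is
made up for the example; it uses per-channel quantization as a crude stand-in
for a real DCT-based JPEG codec, and zlib-compressed size as a rough proxy for
"size of the texture", since an actual JPEG encode/decode is far beyond a short
snippet.  The point it demonstrates is the one in the quoted idea: a lossy
mangling pass leaves the buffer the same length and bit depth, but makes it
much more compressible.]

```python
import random
import zlib

random.seed(0)

# Fake 64x64 RGB texture: a smooth gradient plus noise, 8 bits per channel.
texels = bytes(
    min(255, (x + y) // 2 + random.randrange(16))
    for y in range(64) for x in range(64) for _ in range(3)
)

def lossy_round_trip(data, step=16):
    # Stand-in for a JPEG encode/decode cycle: quantize each 8-bit
    # channel to multiples of `step`.  Like a low-quality JPEG, this
    # throws away fine detail, but the output has exactly the same
    # length and bit depth as the input.
    return bytes((b // step) * step for b in data)

mangled = lossy_round_trip(texels)

assert len(mangled) == len(texels)   # layout and bit depth unchanged
print("original  :", len(zlib.compress(texels)), "bytes compressed")
print("round-trip:", len(zlib.compress(mangled)), "bytes compressed")
```

[On this synthetic texture the mangled copy compresses noticeably smaller,
which is the "inherent lossiness" benefit being suggested -- at the cost of
image quality, and, as discussed below, decode time.]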

Wha...?  Let me get this straight.  You're suggesting that when an OpenGL
user calls glTexImage?D with GL_COMPRESSED_* as the internal format, the
driver should compress the image as a JPEG.  Then, when the texture is used,
it would decompress the texture and upload the uncompressed image (since no
card that I know of can work directly with a JPEG as a texture) to the card?

It's an interesting idea, BUT unless you can get help from the card
decompressing the JPEG on upload (perhaps the Radeon iDCT unit could help?)
-or- you come up with some sort of blazing fast, hand-tuned, assembly-coded
JPEG decoder, the performance will sink faster than the Titanic...and will
rot at the bottom of the ocean for just as long. :)

Hmmm...I wonder if the fragment shader units on modern cards could be used
to do a VRAM-to-VRAM decompression of such a texture...hmm...I still think
the performance would be horrible, though.

-- 
Tell that to the Marines!


_______________________________________________
Dri-devel mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/dri-devel
