Is it no longer possible to use the 16-bit setting of the Render In Image patch? I seem to remember this working on my Radeon X1600-equipped MacBook Pro in the past, but it no longer does. I get blank output when I patch the image output to a Billboard and set the control to anything other than '8-Bit' or 'Default' (which I assume is also 8-bit). In this case, the RII patch contains a GLSL Shader patch. Does anyone know of any kind of workaround to get 16-bit output from the RII?
This is a very card-specific area you're working in. Try a non-GLSL setup inside the RII patch (just a simple Image -> Billboard) and see whether it generates output. The X1600 did work in 16-bit mode in the past, but I'm not sure to what extent. In my experience it has never worked with NVidia cards (which makes 16-bit mode a bad idea for compositions distributed to others). I don't know how well GLSL handles that environment, so maybe that's the problem?
Otherwise, there was an ATI driver update around 10.5.6 (I think?) that fixed some texture/filter corruption issues we had seen. Perhaps that update also removed 16-bit support... (I don't personally have an ATI card to experiment on, so I don't know many details.)
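One way to narrow down whether it's a driver/card limitation: 16-bit render targets depend on the card advertising float-texture support. A minimal sketch, assuming you can get hold of the driver's extensions string (e.g. the output of glGetString(GL_EXTENSIONS), or the list from OpenGL Extensions Viewer); the extension names below are real, but the helper function and sample string are mine for illustration:

```python
# Real float-texture extension names; 16-bit render targets
# generally require at least one of these to be advertised.
FLOAT_TEXTURE_EXTENSIONS = (
    "GL_APPLE_float_pixels",   # Apple's float pixel formats
    "GL_ARB_texture_float",    # ARB float texture formats
    "GL_ATI_texture_float",    # ATI's vendor variant
)

def supported_float_extensions(extensions_string):
    """Return which float-texture extensions appear in a
    space-separated GL extensions string (hypothetical helper)."""
    exts = set(extensions_string.split())
    return [name for name in FLOAT_TEXTURE_EXTENSIONS if name in exts]

# Made-up sample extensions string, for illustration only:
sample = "GL_ARB_multitexture GL_APPLE_float_pixels GL_EXT_framebuffer_object"
print(supported_float_extensions(sample))  # → ['GL_APPLE_float_pixels']
```

If none of those show up after a driver update, that would be consistent with the update having dropped 16-bit support.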
-- [ christopher wright ] [email protected] http://kineme.net/