Hello,

I've noticed some oddities with the QCPlugInInputImageSource protocol methods that work with textures, specifically with regard to texture coordinates when using more than one input image source.

I have a plugin in development with two image inputs, one for video and one for smaller 256x1 LUT textures, and I am attempting to bind them within my

- (BOOL) execute:(id<QCPlugInContext>)context atTime:(NSTimeInterval)time withArguments:(NSDictionary *)arguments

method like so:

        <snip..>
        if (image && [image lockTextureRepresentationWithColorSpace:[image imageColorSpace] forBounds:[image imageBounds]] &&
            lut && [lut lockTextureRepresentationWithColorSpace:[lut imageColorSpace] forBounds:[lut imageBounds]])
        {
                [image bindTextureRepresentationToCGLContext:[context CGLContextObj] textureUnit:GL_TEXTURE0 normalizeCoordinates:NO];
                [lut bindTextureRepresentationToCGLContext:[context CGLContextObj] textureUnit:GL_TEXTURE1 normalizeCoordinates:NO];

                // Make sure to flush as we use FBOs and the passed OpenGL context may not have a surface attached
                finalOutput = renderToFBO(cgl_ctx, width, height, bounds, [image textureName], [lut textureName], lutWidth, lutHeight, glslProgramObject, amt);
        <snip..>

Now, I am passing normalizeCoordinates:NO to bindTextureRepresentationToCGLContext:. My LUT image is 256x1 pixels, and 'image' has some video-esque dimensions, like 640x480.
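My understanding (please correct me if I'm wrong) is that with normalizeCoordinates:NO the textures come through as rectangle textures, so I'd expect pixel-space coordinates on each unit, roughly:

// What I'd expect with normalizeCoordinates:NO (rectangle textures):
// unit 0 (640x480 video): coords in [0..640] x [0..480]
// unit 1 (256x1 LUT):     coords in [0..256] x [0..1]
vec2 videoCoords = vec2(gl_TextureMatrix[0] * gl_MultiTexCoord0);
vec2 lutCoords   = vec2(gl_TextureMatrix[1] * gl_MultiTexCoord1);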

The renderToFBO function simply produces an output texture by rendering to a texture attachment (this code works in my other plugins and was verified to work on the GL list).

Now, I am seeing some oddities when attempting to work with any texture matrix or coordinate data from texture unit 1, my LUT texture specifically. I don't get any working values from it.

If I use my texture coordinates like so:

lutTextureCoords = vec2(gl_TextureMatrix[1] * gl_MultiTexCoord1); (in GLSL)

(which should have the proper coords, since I'm binding to the GL_TEXTURE1 unit and not normalizing), I get no coords at all (verified by using a simple texture coord visualizer GLSL shader).

However, if I instead use texture unit 0's coords, I do get a result:

lutTextureCoords = vec2(gl_TextureMatrix[0] * gl_MultiTexCoord0); (in GLSL)

Also, I've noticed that *BOTH* GL_TEXTURE0's and GL_TEXTURE1's coordinates are actually normalized, even though I've asked for them not to be, and I am not touching the texture matrix manually at all, or loading the identity in glMatrixMode(GL_TEXTURE).

As another data point, I've also noticed similar oddities with texture unit 1 in the regular GLSL patch (noted a while ago on the mailing list).

In short, what's the deal with more than one texture in a custom patch? What should I be expecting coordinate-wise?

I'm able to work around the issue by manually altering the texture coords myself, but I'm curious whether anyone else has noticed these oddities, or whether I am doing anything to cause them.
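For what it's worth, my workaround looks something like the following sketch (the uniform names here are just illustrative; lutSize is set from the execute: method, and I reuse unit 0's coords since those are the only ones that come through):

// Fragment shader sketch: derive the LUT coords manually instead of
// relying on gl_TextureMatrix[1] / gl_MultiTexCoord1.
uniform sampler2DRect lut;   // 256x1 LUT bound as a rectangle texture on unit 1
uniform vec2 lutSize;        // e.g. vec2(256.0, 1.0), passed in from the plugin

void main()
{
        // Unit 0's coords do arrive (normalized, despite normalizeCoordinates:NO),
        // so rescale them into the LUT's pixel space by hand.
        vec2 norm = vec2(gl_TextureMatrix[0] * gl_MultiTexCoord0);
        vec2 lutTextureCoords = norm * lutSize;
        gl_FragColor = texture2DRect(lut, lutTextureCoords);
}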

Thanks,

_______________________________________________
Quartzcomposer-dev mailing list      ([email protected])