I've written a QC patch which takes a texture representation, creates a CIImage 
from it, applies a series of CIFilters to it, then draws the resulting CIImage 
to another texture and sets it as the output image.
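Roughly, the execute path looks like this (a simplified sketch — the filter chain shown is a stand-in for my actual filters, and error handling is trimmed):

```objc
// Sketch of the execute path. Assumes the input texture has already been
// locked via -lockTextureRepresentationWithColorSpace:forBounds: (below).
CIImage *image = [CIImage imageWithTexture:[inputImage textureName]
                                      size:CGSizeMake([inputImage textureWidth],
                                                      [inputImage textureHeight])
                                   flipped:[inputImage textureFlipped]
                                colorSpace:inputImageColorSpace];

// Placeholder for the real filter chain.
CIFilter *filter = [CIFilter filterWithName:@"CIGaussianBlur"];
[filter setValue:image forKey:kCIInputImageKey];
image = [filter valueForKey:kCIOutputImageKey];

// The result is then drawn to another texture and set as the output image.
```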

This all works fine, but is tremendously slow when pushing the video patch 
output through it.


I've sampled QC with Instruments. A significant amount of time is being spent 
in `vt_Copy_yuvsITU709_32ARGB_vec` 
[http://emberapp.com/keith_duncan/images/screen-shot-2010-11-25-at-14-34-09].

My code for getting the input image as a texture is as follows:

> CGColorSpaceRef inputImageColorSpace = ([inputImage imageColorSpace] ? : [context colorSpace]);
>
> if (![inputImage lockTextureRepresentationWithColorSpace:inputImageColorSpace forBounds:[inputImage imageBounds]]) {
>     [context logMessage:@"%@ couldn't lockTextureRepresentationWithColorSpace:forBounds:", inputImage];
>     return NO;
> }

This explains the colorspace conversion: `[inputImage imageColorSpace]` is RGB (as documented in the header), so locking a texture representation of a YUV video frame forces the YUV-to-ARGB copy seen in the sample.


How can I avoid the costly conversion? I tried passing NULL as the colorspace 
to hint that no conversion should be performed, but then 
`-lockTextureRepresentationWithColorSpace:…` returns NO.
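For what it's worth, a quick diagnostic (a sketch using plain CoreGraphics calls) confirms that the input image's colorspace and the context's colorspace differ:

```objc
// Compare the colorspace models of the input image and the QC context.
CGColorSpaceModel inputModel = CGColorSpaceGetModel([inputImage imageColorSpace]);
CGColorSpaceModel contextModel = CGColorSpaceGetModel([context colorSpace]);
[context logMessage:@"input model %d, context model %d", (int)inputModel, (int)contextModel];
```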


Is there a more efficient way to pass video through Quartz Composer? I wrote a 
QC patch to take advantage of the capture and drawing patches, though I could 
always go back and write everything from scratch.


Thanks,
Keith

 _______________________________________________
Quartzcomposer-dev mailing list      (Quartzcomposer-dev@lists.apple.com)