OK, I've got to the point where the system attempts to create the provider; gdb shows that the correct data and data structures are present.

The problem I have now is that the provider is throwing an exception in QC itself: "Argument pixelFormat does not verify ..." (the message is cut off at the edge of the text box).

I've got it checking endianness and using the QCPlugInPixelFormat...8 constants, so I'm assuming the problem is that the JPEG (and therefore the bitmap) is 24-bit, without an alpha channel.

There's no setting for straight RGB8 without the alpha channel, so I take it I have to do a preprocessing step and map the colour depth to 32-bit instead?

One option is to write a CIImage filter that expands the pixel depth from 24 to 32 bits using a GPU kernel. The other (slower) option is to create a 32-bit bitmap and copy into it, filling in the missing alpha channel with an (i, j) loop?
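[A third option, for what it's worth: let Quartz do the 24-to-32-bit conversion by drawing the decoded image into a 32-bit ARGB bitmap context, so no per-pixel loop is needed. A rough, untested sketch -- `sourceImage` stands in for whatever CGImageRef the JPEG was decoded into:

// Expand a 24-bit RGB CGImage to 32-bit ARGB by drawing it into a
// bitmap context; Quartz synthesises the alpha channel during the draw.
size_t width    = CGImageGetWidth(sourceImage);
size_t height   = CGImageGetHeight(sourceImage);
size_t rowBytes = width * 4;                       // 4 bytes per pixel (ARGB)
void *pixels = malloc(rowBytes * height);
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(pixels, width, height,
                                         8,        // bits per component
                                         rowBytes, cs,
                                         kCGImageAlphaPremultipliedFirst);
CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), sourceImage);
// `pixels` now holds 32-bit ARGB data suitable for the image provider.
CGContextRelease(ctx);
CGColorSpaceRelease(cs);

The provider would then own `pixels` and free it when QC releases the buffer. -- Ed.]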

Nick


On 18 Jul 2008, at 00:05, Christopher Wright wrote:

So far I've created an NSData with the in-memory JPEG, and now have a CIImage. At a later date I wish to have the option of processing the image data with CIFilters (not yet though).

The problem is that renderToBuffer: requires pixel output, but I have a CIImage which is effectively the compressed JPEG. I admit I'm confused and lost as to where to start with all the image tools.

I'm not up to par on the renderToBuffer: method, but you could try creating a CGImage or NSImage (or even an NSBitmapImageRep) from the JPEG data -- CIImage is 100% useless for anything other than CoreImage processing. There are all kinds of intermediate ways to pass data around between the formats (with various pros and pitfalls), but if you've got hard data, just make a raster image for now.

QC's smart enough to convert it to a CIImage if necessary regardless of the way you output it in your plugin. If you're going to do processing inside the plugin, then you may want to keep it a CIImage, and instead create a raster image from the result using something like this:

NSBitmapImageRep *bmp = [[NSBitmapImageRep alloc] initWithCIImage: myCIImage];
NSImage *frame = [[NSImage alloc] init];
[frame addRepresentation: bmp];
[bmp release];

An alternative:

NSCIImageRep *bmp = [[NSCIImageRep alloc] initWithCIImage: myCIImage];
NSImage *frame = [[NSImage alloc] init];
[frame addRepresentation: bmp];
[bmp release];

(the latter one doesn't work for us due to some subtle issues with reading the NSImage later on, but if it works for you, then it's a pretty fast path with no intermediate copies)

My background in parallel systems (big data sets) makes me feel that any intermediate buffer is not right, and that it should be possible to render from one form to the other in one go. Is this possible?

Unfortunately, it's not _quite_ that simple :) as you've no doubt noticed.
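[One direct path does exist, though: CIContext's -render:toBitmap:rowBytes:bounds:format:colorSpace: rasterises a CIImage straight into a caller-supplied buffer, with no NSImage or NSBitmapImageRep in between. A rough, untested sketch -- `myCIImage` is the image from the earlier snippets, and `ciContext` is assumed to be a CIContext you've already created (e.g. via +contextWithCGContext:options:):

// Render a CIImage directly into a caller-supplied 32-bit ARGB buffer,
// in a single pass with no intermediate image objects.
CGRect extent   = [myCIImage extent];
size_t width    = (size_t)CGRectGetWidth(extent);
size_t height   = (size_t)CGRectGetHeight(extent);
size_t rowBytes = width * 4;
void *buffer = malloc(rowBytes * height);
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
[ciContext render:myCIImage
         toBitmap:buffer
         rowBytes:rowBytes
           bounds:extent
           format:kCIFormatARGB8
       colorSpace:cs];
CGColorSpaceRelease(cs);
// `buffer` now holds the rendered ARGB pixels.

Whether this is actually faster than the NSBitmapImageRep route depends on where the CIImage's backing data lives. -- Ed.]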

Feel free to keep asking for more information though -- you can learn a lot about the internals of the various image frameworks by mucking about with problems like this :)

--
[ christopher wright ]
[EMAIL PROTECTED]
http://kineme.net/


 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Quartzcomposer-dev mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/quartzcomposer-dev/archive%40mail-archive.com

