A little background:
I have a project that takes images from an external system. They arrive as JPEGs in memory from a socket stream (this isn't a straight URL). What I'd like to do is provide an output port for those JPEG images in a custom provider patch.

The patch works; the JPEG frames are being read in line with the calls to the plug-in's execution method, but I need to get those frames into QC itself (i.e. an image output port).


My "progress":
I've created a QCPlugInOutputImageProvider and I'm now attempting to write the code that renders into a memory buffer or into a CGLContext.

So far I've created an NSData with the in-memory JPEG, and now have a CIImage. At a later date I wish to have the option of processing the image data with CIFilters (not yet though).
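For reference, this is roughly what I have so far (`jpegBytes` and `jpegLength` are just my placeholder names for the bytes pulled off the socket in the execution method):

```objc
#import <QuartzCore/CoreImage.h>

// 'jpegBytes'/'jpegLength' stand in for the data read from the socket.
NSData *jpegData = [NSData dataWithBytes:jpegBytes length:jpegLength];
CIImage *image = [CIImage imageWithData:jpegData];
if (image == nil) {
    // The JPEG was truncated or corrupt; skip this frame.
    return NO;
}
```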

The problem is that renderToBuffer: wants raw pixels, while what I have is a CIImage that still effectively wraps the compressed JPEG, and I admit I'm confused and don't know where to start with all the image tools.

I know that a CIImage is the input into a processing pipeline whose end result then gets rendered (just like VTK, for example), so the only way I can see this working is to render the CIImage into... something. But into what, and how?
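From my reading of the docs, CIContext looks like the missing piece: it can render a CIImage directly into a caller-supplied bitmap. A minimal sketch of what I mean (the buffer, row bytes and bounds would be whatever renderToBuffer: hands me; kCIFormatARGB8 is my guess at a matching pixel format):

```objc
// Create once and cache -- CIContext setup is expensive.
CIContext *context = [CIContext contextWithOptions:nil];
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

// 'baseAddress', 'rowBytes' and 'bounds' are the arguments that
// renderToBuffer:withBytesPerRow:pixelFormat:forBounds: receives.
[context render:image
       toBitmap:baseAddress
       rowBytes:rowBytes
         bounds:NSRectToCGRect(bounds)
         format:kCIFormatARGB8
     colorSpace:colorSpace];

CGColorSpaceRelease(colorSpace);
```

Is that the right idea, or is there a more direct route?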

My background in parallel systems (big data sets) makes me feel that any intermediate buffer is wrong here and that it should be possible to render from one form to the other in one go. Is this possible?
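Putting the above together, my current guess is that the provider itself can do the CIImage-to-pixels conversion lazily, so the only buffer involved is the one QC supplies. A sketch of what I have in mind (class and ivar names are mine, retain/release trimmed, pixel-format handling simplified to ARGB8 -- please correct me if this is the wrong shape):

```objc
#import <Quartz/Quartz.h>

// One provider instance wraps one in-memory JPEG frame.
@interface JPEGFrameProvider : NSObject <QCPlugInOutputImageProvider> {
    CIImage         *_image;
    CIContext       *_context;     // cached; context creation is expensive
    CGColorSpaceRef  _colorSpace;  // cached so imageColorSpace can return it
}
- (id)initWithJPEGData:(NSData *)data;
@end

@implementation JPEGFrameProvider

- (id)initWithJPEGData:(NSData *)data
{
    if ((self = [super init])) {
        _image = [[CIImage imageWithData:data] retain];
        _context = [[CIContext contextWithOptions:nil] retain];
        _colorSpace = CGColorSpaceCreateDeviceRGB();
    }
    return self;
}

- (NSRect)imageBounds
{
    CGRect extent = [_image extent];
    return NSMakeRect(0, 0, extent.size.width, extent.size.height);
}

- (CGColorSpaceRef)imageColorSpace
{
    return _colorSpace;
}

- (NSArray *)supportedBufferPixelFormats
{
    return [NSArray arrayWithObject:QCPlugInPixelFormatARGB8];
}

- (BOOL)renderToBuffer:(void *)baseAddress
       withBytesPerRow:(NSUInteger)rowBytes
           pixelFormat:(NSString *)format
             forBounds:(NSRect)bounds
{
    // Core Image decodes the JPEG and writes pixels straight into
    // the buffer that QC supplies -- no extra copy on our side.
    [_context render:_image
            toBitmap:baseAddress
            rowBytes:rowBytes
              bounds:NSRectToCGRect(bounds)
              format:kCIFormatARGB8
          colorSpace:_colorSpace];
    return YES;
}

@end
```

The execution method would then just assign `self.outputImage = [[[JPEGFrameProvider alloc] initWithJPEGData:jpegData] autorelease];` for each new frame.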

Help! Any pointers, links, code fragments, names of classes to look at, or even just confirmation that I'm heading in the right direction would be gratefully received.

Lost and rather confused,
Nick.

_______________________________________________
Do not post admin requests to the list. They will be ignored.
Quartzcomposer-dev mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/quartzcomposer-dev/archive%40mail-archive.com