I've been using QC as a test bed for sequences of CI filters and math operations for image analysis. Basically, I put an image in, pass it through various filters and bits 'n bobs, and some info comes out the other end. Ideally, I'd like to invoke this QC composition from my Cocoa code, much like a function or method. I don't want to display the composition's output anywhere; it's purely for analysis. If I could do that, I could tweak things easily in QC and then hop back to Xcode. I see that you can use compositions in IB and have them displayed in a QCView, but that's not what I want to do. Does anyone have experience with this? Is it a worthwhile approach?
It can be a worthwhile approach, and it allows for some things that a normal CI filter chain can't do.
To do this, look into QCRenderer rather than QCView -- QCRenderer is the offline version (it isn't tied to the display). You can feed it a custom GL context (i.e., an NSOpenGLContext), or have it manage its own offscreen context. You then render to that context and read the result back (QCRenderer's snapshotImageOfType: is the method of choice, I think).
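A minimal sketch of the offscreen approach (the .qtz path, output size, and the published "inputImage" port name are assumptions -- substitute whatever your composition actually publishes):

```objc
#import <Quartz/Quartz.h>

// Load the composition and create an offscreen renderer (no view needed).
QCComposition *composition =
    [QCComposition compositionWithFile:@"/path/to/analysis.qtz"];  // hypothetical path
QCRenderer *renderer = [[QCRenderer alloc]
    initOffScreenWithSize:NSMakeSize(640.0, 480.0)
               colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB)
              composition:composition];

// Push the image into a published input port, then run the patch graph once.
[renderer setValue:sourceImage forInputKey:@"inputImage"];  // NSImage, CIImage, or a path
[renderer renderAtTime:0.0 arguments:nil];

// Read the rendered frame back; "create" means the caller owns the object.
CIImage *result = [renderer createSnapshotImageOfType:@"CIImage"];
// ...analyze `result` here...
```

Called this way it behaves like a function: set inputs, render, read the snapshot. Nothing ever hits the screen.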
Anyway, there are also Image Filter compositions -- a special protocol where the composition doesn't render anything, and you use published outputs to get the resultant image. This is conceptually closer to what you were asking, but a bit different in terms of how it's implemented.
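With that protocol the published ports have standard keys, so you pull the result off an output instead of snapshotting the frame. A sketch, assuming a filter.qtz that conforms to the Image Filter protocol (the path and size are placeholders):

```objc
#import <Quartz/Quartz.h>

QCComposition *filter =
    [QCComposition compositionWithFile:@"/path/to/filter.qtz"];  // hypothetical path
// Optionally confirm the composition really conforms to the Image Filter protocol.
BOOL isFilter = [[filter protocols] containsObject:QCCompositionProtocolImageFilter];

QCRenderer *renderer = [[QCRenderer alloc]
    initOffScreenWithSize:NSMakeSize(512.0, 512.0)
               colorSpace:CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB)
              composition:filter];

// Protocol-defined keys, not ad-hoc port names.
[renderer setValue:sourceImage forInputKey:QCCompositionInputImageKey];
[renderer renderAtTime:0.0 arguments:nil];
CIImage *filtered = [renderer valueForOutputKey:QCCompositionOutputImageKey
                                         ofType:@"CIImage"];
```

The difference from the snapshot approach above is that you read a published output port rather than the framebuffer, which is why the composition itself renders nothing.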
-- [ christopher wright ] [email protected] http://kineme.net/
Quartzcomposer-dev mailing list ([email protected])

