Hello, I have a set of pictures stored in RAM in RGB 32-bits-per-pixel format (allocated via malloc). What is nowadays the most efficient way to "play" this "video" - i.e. this set of frames - in a window, avoiding unnecessary copying of the data?
Should I just build a CGImage for every frame I have in memory and render it through an ordinary NSImageView, or draw it directly in an NSView? Or maybe use QTKit's QTMovieView? I am planning to use a CoreVideo display link to drive the display of each new frame. Or would it be more optimal to use an NSOpenGLView, since the data would then perhaps go "directly" to the OpenGL driver?

This is a model task where I'm trying to understand how to do things :) Right now my only concern is performance - the faster the better. I'd also like to know what the "right way" would be - the approach that adds the least code 'mess', even if it is not quite as fast. Thanks!
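To make the CGImage route concrete, here is roughly what I have in mind - just a sketch in Swift, assuming the pixels are RGBX with no alpha; FrameView, Frame and currentFrame are placeholder names of my own, not real API:

import Cocoa

// A view that draws one RGBX (32 bits per pixel) frame living in a malloc'd buffer.
final class FrameView: NSView {

    struct Frame {
        let pixels: UnsafeMutableRawPointer   // malloc'd, width * height * 4 bytes
        let width: Int
        let height: Int
    }

    var currentFrame: Frame? {
        didSet { needsDisplay = true }        // redraw whenever a new frame is assigned
    }

    override func draw(_ dirtyRect: NSRect) {
        guard let frame = currentFrame,
              let ctx = NSGraphicsContext.current?.cgContext else { return }

        let bytesPerRow = frame.width * 4

        // Wrap the existing buffer; the empty release callback means the caller
        // keeps ownership of the malloc'd memory.
        guard let provider = CGDataProvider(dataInfo: nil,
                                            data: frame.pixels,
                                            size: bytesPerRow * frame.height,
                                            releaseData: { _, _, _ in }) else { return }

        guard let image = CGImage(width: frame.width,
                                  height: frame.height,
                                  bitsPerComponent: 8,
                                  bitsPerPixel: 32,
                                  bytesPerRow: bytesPerRow,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipLast.rawValue),
                                  provider: provider,
                                  decode: nil,
                                  shouldInterpolate: false,
                                  intent: .defaultIntent) else { return }

        ctx.draw(image, in: bounds)
    }
}

As far as I understand, CGDataProvider only wraps the buffer and doesn't copy it up front, but Quartz may still copy the pixels internally when it actually draws - which is exactly the kind of recopying I'd like to avoid if possible.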

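And this is roughly the display-link setup I mean - again just a sketch; CVDisplayLinkSetOutputHandler needs 10.11 or later, and DisplayLinkDriver plus the frameHandler callback are names I made up:

import Cocoa
import CoreVideo

// Fires a callback on the main thread at every display refresh.
final class DisplayLinkDriver {

    private var displayLink: CVDisplayLink?
    private let frameHandler: () -> Void      // e.g. advance the frame index and mark the view dirty

    init(frameHandler: @escaping () -> Void) {
        self.frameHandler = frameHandler

        CVDisplayLinkCreateWithActiveCGDisplays(&displayLink)
        guard let link = displayLink else { return }

        // The output handler runs on a dedicated high-priority thread,
        // so hop back to the main thread before touching AppKit.
        CVDisplayLinkSetOutputHandler(link) { [weak self] _, _, _, _, _ in
            DispatchQueue.main.async { self?.frameHandler() }
            return kCVReturnSuccess
        }
        CVDisplayLinkStart(link)
    }

    deinit {
        if let link = displayLink { CVDisplayLinkStop(link) }
    }
}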