On Sep 22, 2012, at 3:18 AM, Zak Nelson <z...@adobe.com> wrote:

> This framework renders HTML content and the user can interact with it. The
> framework only gives you access to the NSView which it renders into. Since
> the content is rendered in the subprocess, which is hidden, I'm trying to
> translate the inputs I get from the parent process into the child process.

This is possible to do, but it’s a _buttload_ of work. Chrome does it (and now 
Safari). It involves setting up shared memory buffers for the pixmaps and using 
a lot of IPC calls to route events back and forth. That sounds straightforward 
at this level, but this level is sort of like “you play the flute by blowing in 
one end and wiggling your fingers over the holes”, to quote Monty Python. I 
worked on Chrome for a year, and saw bits of the code that makes this happen, 
and it looked Extremely Gnarly. Especially when you get into stuff like browser 
plugins, and hardware-accelerated graphics that want to render directly to the 
screen, not to a pixmap.

If the framework generates HTML, why can’t you capture that HTML and send it 
to your app to display in a WebView? Or run a trivial little HTTP server in the 
framework process and point a WebView at its URL from the app; I’ve written 
several programs that use this approach.
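The “trivial little HTTP server” really can be trivial. Here’s a hedged sketch in C (the port, the placeholder page, and the function names are mine, not from any framework): a helper that wraps an HTML body in a bare HTTP/1.0 response, plus a one-connection-at-a-time accept loop that answers every request with it.

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Wrap an HTML body in a minimal HTTP/1.0 response. Returns the number
 * of bytes written, or -1 if the buffer was too small. */
static int build_response(const char *html, char *out, size_t cap) {
    int n = snprintf(out, cap,
                     "HTTP/1.0 200 OK\r\n"
                     "Content-Type: text/html\r\n"
                     "Content-Length: %zu\r\n"
                     "\r\n"
                     "%s",
                     strlen(html), html);
    return (n < 0 || (size_t)n >= cap) ? -1 : n;
}

/* Bind 127.0.0.1:port and answer every request with the same page.
 * In the real setup the framework process would regenerate `html`
 * per request instead of serving a constant. Blocks forever. */
static int serve_forever(unsigned short port, const char *html) {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    if (srv < 0) return -1;
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(port);
    if (bind(srv, (struct sockaddr *)&addr, sizeof addr) < 0) return -1;
    if (listen(srv, 8) < 0) return -1;
    for (;;) {
        int cli = accept(srv, NULL, NULL);
        if (cli < 0) continue;
        char req[1024], resp[8192];
        (void)read(cli, req, sizeof req);   /* discard the request */
        int n = build_response(html, resp, sizeof resp);
        if (n > 0) (void)write(cli, resp, n);
        close(cli);
    }
}
```

On the app side you’d then just point the WebView at http://127.0.0.1:<port>/ and let WebKit do all the rendering and event handling for you, which is the whole appeal of this approach.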

OK, wait, I just noticed that the framework you want to embed _is_ Chromium. 
The obvious question is, why can’t you just use a WebView instead? Is there 
some kind of functionality you need that only exists in Chromium’s fork of 
WebKit and not Apple’s?

—Jens
_______________________________________________

Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
