Julien Isorce wrote:

> A typical scenario when running WebKit in an embedded environment like
> a TV or STB is to "punch through" graphics to the video plane.
> 
> At Samsung Research UK we have developed a new <hole> element that
> acts much like a <canvas> to
> 
> - Expose a rectangular "hole" in a web page.
> - Support a mechanism to retrieve the position and size of the hole
> from JavaScript whenever its dimensions or location change.

Hi Julien!

Your use case definitely resonates with me, so here are a couple of
considerations from our (Collabora) experience.

The problem we faced for a customer was showing a 1440x900 @ 60fps
captured video stream with a custom HTML UI on top (animations and the
usual stuff) in real time on a rather slow platform.

At first we took the most web-friendly route we could think of, by
simply modifying the out-of-tree Clutter port of WebKit to understand a
special v4l2:// URL scheme for <video> elements and then piping the
video stream to the <video> texture through GL (thus not using any
special hardware overlay).

Unfortunately this was not enough to achieve 60fps, so we then deployed
a solution that made good use of the hardware overlays provided by the
platform: we moved the video out of WebKit into its own hardware
overlay, put WebKit-Clutter in its own overlay on top of the video, and
enabled its transparent-page support.

This way we just needed to set background:transparent on the <html>
element to see through the page to the video stream.

We then injected a trivial JS API to give pages control over the
position and size of the video overlay: with the help of
requestAnimationFrame() and some transparent placeholder <div>s, this
let us implement fancy zoom-in/out animations synchronized with visible
page elements with surprisingly little code.

It also had the benefit of decoupling the 60fps video stream from the
web engine, which was then set to run at 30fps to better fit the
graphics hardware's capabilities, noticeably reducing jank during
animations.

In the future I see this use case better served by having WebKit run as
a Wayland nested compositor, which would allow the main compositor to
transparently decide to use hardware overlays, GL composition or some
other platform-specific mechanism on a frame-per-frame basis.

In the context of the GTK+ port, by using the waylandsink GStreamer
component it would be possible to push video frames directly to the
hardware (potentially from hardware decoders) without any copy.
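As a sketch of that zero-copy path, a waylandsink-based pipeline could
look roughly like the following (the v4l2 device path is an assumption,
and element availability depends on the platform's GStreamer build):

```shell
# Pipeline sketch only: push captured frames to a Wayland surface via
# waylandsink; with a capable driver no intermediate copy is needed.
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! waylandsink
```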

I hope you'll find this useful!

-- 
Emanuele Aina
www.collabora.com


_______________________________________________
webkit-dev mailing list
webkit-dev@lists.webkit.org
https://lists.webkit.org/mailman/listinfo/webkit-dev