Re: [whatwg] VIDEO Timeupdate event frequency.
On Fri, 10 Sep 2010, Biju wrote:
> Matthew Gregan wrote in https://bugzilla.mozilla.org/show_bug.cgi?id=571822 :
> Firefox fires the timeupdate event once per frame. Safari 5 and Chrome 6
> fire every 250ms. Opera 10.50 fires every 200ms. Now in Firefox bug 571822
> they are changing Firefox to fire the timeupdate event every 250ms. But
> this takes away control from somebody who wants to do some image
> processing on every frame, and it will also miss frames. So can we have a
> newFrame event and/or a minTimeupdate property to say what the minimum
> time interval between consecutive timeupdate events should be?

The idea behind timeupdate is just to update the seek bar, and to do so at
whatever rate the browser thinks is the best balance between power/CPU usage
and user experience.

There's currently no way to do anything on a per-frame basis explicitly, but
if we were to support that use case, I imagine we'd want to do something more
efficient and precise than just firing an event and hoping the script can
paint the video before the next frame. That's probably a feature for later,
though. Implementations haven't quite gotten the current set of features down
reliably yet. :-)

On Fri, 10 Sep 2010, Simon Fraser wrote:
> On Sep 10, 2010, at 10:07 AM, Tab Atkins Jr. wrote:
>> On Fri, Sep 10, 2010 at 9:58 AM, Simon Fraser <s...@me.com> wrote:
>>> In WebKit on Mac, video playback is hardware-accelerated, and the
>>> presentation of video frames is disconnected from the web page drawing
>>> machinery. A newFrame callback would force us to drop back into software
>>> rendering, which is significantly more CPU intensive. I don't support
>>> the general use of a 'newFrame' callback except in the context of video
>>> processing via canvas.
>>
>> In general, video processing via canvas is going to require dropping into
>> software rendering, right? I think that's what I was hearing from our
>> dudes putting hardware-accelerated video into Chrome. So at least in the
>> case that I can see this often being put towards, you don't lose
>> anything.
>
> My concern would be pages registering for newFrame events just to do stuff
> like updating a controller, which will vastly increase CPU usage.

Yeah, if we support this it would make sense to do it in some clever way
where you couldn't (easily) abuse it, e.g. have the registered callback be
GPU code that doesn't run in the same scripting context (not a concrete
proposal, just an idea -- we'd have to wait until WebGL has a proven solution
for safely running code on the GPU).

On Sat, 11 Sep 2010, Robert O'Callahan wrote:
> On Sat, Sep 11, 2010 at 11:03 AM, Tab Atkins Jr. <jackalm...@gmail.com> wrote:
>> So... no newframe event for now, leave timeupdate as it is, and fix this
>> in the future?
>
> I think so. Another factor is that a lot of the video effects people have
> been using canvas for can actually be done with SVG filters, which can be
> GPU-accelerated and are compatible with asynchronous compositing. So it
> might be wise to focus on use-cases for video processing that aren't
> amenable to SVG filters (or extensions thereof), and understand what their
> requirements are.

Indeed.

-- 
Ian Hickson               U+1047E                )\._.,--....,'``.    fL
http://ln.hixie.ch/       U+263A                /,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'
Re: [whatwg] VIDEO Timeupdate event frequency.
On 9/11/10 8:56 AM, Roger Hågensen wrote:
> I can't recall any browsers exposing vsync. (does any?)

Gecko is working on it. See
http://weblogs.mozillazine.org/roc/archives/2010/08/mozrequestanima.html

-Boris
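The mozRequestAnimationFrame experiment Boris links to is Gecko's vendor-prefixed precursor to a standard frame-callback API. A minimal feature-detection sketch, assuming the vendor-prefixed names of the era (the WebKit-prefixed name is an assumption here, not something confirmed in this thread):

```javascript
// Pick the best frame-callback mechanism a window object offers, falling
// back to a plain ~60Hz timer when none is available. Only
// mozRequestAnimationFrame is attested in the thread; the other names are
// illustrative assumptions.
function pickFrameScheduler(win) {
  var raf = win.requestAnimationFrame ||
            win.mozRequestAnimationFrame ||     // Gecko's prefixed version
            win.webkitRequestAnimationFrame;    // assumed WebKit analogue
  if (raf) {
    return function (callback) { raf.call(win, callback); };
  }
  // Fallback: a coarse timer with no vsync alignment.
  return function (callback) { win.setTimeout(callback, 1000 / 60); };
}
```

The point of the indirection is that callers schedule one frame at a time and re-register from inside the callback, so a backgrounded or throttled page simply stops asking for frames.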
Re: [whatwg] VIDEO Timeupdate event frequency.
On Sat, Sep 11, 2010 at 2:20 PM, Robert O'Callahan <rob...@ocallahan.org> wrote:
> On Sat, Sep 11, 2010 at 11:03 AM, Tab Atkins Jr. <jackalm...@gmail.com> wrote:
>> On Fri, Sep 10, 2010 at 4:01 PM, Robert O'Callahan <rob...@ocallahan.org> wrote:
>>> I think an ideal API for video frame processing would involve handing
>>> video frames to a Worker for processing.
>>
>> Mm, yeah, probably. But then you'd need to be able to do canvas on
>> workers, and hand the data back... This is a complex problem.
>
> Most of the use cases I've seen just do get/putImageData, so it might make
> sense to just provide raw frame data to the Worker and not introduce a
> canvas dependency.

Yes, I think that makes sense, though I would not restrict it to image data,
but include audio data too (when we have the API for it). Dragging image data
through a canvas just to get to the pixels is actually really annoying. If we
could set a newFrame event on a video for a worker, and the event data
contained the video image with associated audio information, that would be
best.

>> So... no newframe event for now, leave timeupdate as it is, and fix this
>> in the future?
>
> I think so. Another factor is that a lot of the video effects people have
> been using canvas for can actually be done with SVG filters, which can be
> GPU-accelerated and are compatible with asynchronous compositing. So it
> might be wise to focus on use-cases for video processing that aren't
> amenable to SVG filters (or extensions thereof), and understand what their
> requirements are.

Things like object segmentation, face recognition, object tracking in video,
or anything with frequency analysis in audio come to mind. Workers seem
heaven-made for these anyway, though right now with the canvas indirection it
isn't really optimal.

Cheers,
Silvia.
Re: [whatwg] VIDEO Timeupdate event frequency.
On 2010-09-11 05:23, Eric Carlson wrote:
> On Sep 10, 2010, at 8:06 PM, Biju wrote:
>> On Fri, Sep 10, 2010 at 7:05 AM, Silvia Pfeiffer
>> <silviapfeiff...@gmail.com> wrote:
>>> Incidentally: What use case did you have in mind, Biju?
>>
>> I was thinking about applications like
>> https://developer.mozilla.org/samples/video/chroma-key/index.xhtml
>> ( https://developer.mozilla.org/En/Manipulating_video_using_canvas )
>> Now it is using setTimeout, so if the processor is fast it will be
>> processing the same frame more than one time, wasting system resources,
>> which may affect other running processes.
>
> Perhaps, but it only burns cycles on those pages instead of burning cycles
> on *every* page that uses a video element.
>
>> If we use the timeupdate event we may be missing some frames, as the
>> timeupdate event only happens every 200ms or 250ms, i.e. 4 or 5 frames
>> per second.
>
> Even in a browser that fires 'timeupdate' every frame, you *will* miss
> frames on a heavily loaded machine because the event is fired
> asynchronously.
>
>> And we know there are videos which have more than 5 frames per second.
>
> So use a timer if you know that you want to update more frequently.

Hmm! Once you get up to around 60 FPS (1000ms/60 = 16.6...ms) you are getting
close to 15ms per frame, and unless the OS is running at a smaller timer
period, that is all the precision you can get. I believe Windows Media Player
uses 5ms periods, and the smallest period advisable on a modern Windows
system is 2ms; 1ms is most likely not consistently achievable on any typical
OS (there will be fluctuations) that is not a real-time OS (and few end-user
OSes are these days). This would have to be synced to the display refresh
rate instead (no point processing frames that are not displayed/skipped
anyway), and I can't recall any browsers exposing vsync. (does any?)

-- 
Roger "Rescator" Hågensen.
Freelancer - http://EmSai.net/
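Roger's timer-granularity arithmetic can be made concrete. A small sketch (the 15ms, 5ms, and 2ms figures come from his message; the helper names are made up for illustration):

```javascript
// Frame budget at a given frame rate, in milliseconds.
function frameBudgetMs(fps) {
  return 1000 / fps;
}

// How many whole timer ticks of a given OS timer period fit inside one
// frame. At 60 FPS with a classic 15ms timer period, only one tick fits,
// so a setTimeout-driven loop cannot reliably hit every frame.
function ticksPerFrame(fps, timerPeriodMs) {
  return Math.floor(frameBudgetMs(fps) / timerPeriodMs);
}
```

With a 5ms period (the Windows Media Player figure in the message) three ticks fit per 60 FPS frame, which is why shortening the system timer period makes timer-driven frame processing feasible, at a power cost.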
[whatwg] VIDEO Timeupdate event frequency.
https://bugzilla.mozilla.org/show_bug.cgi?id=571822

Firefox fires the timeupdate event once per frame. Safari 5 and Chrome 6 fire
every 250ms. Opera 10.50 fires every 200ms.

Now in Firefox bug 571822 they are changing Firefox to fire the timeupdate
event every 250ms. But this takes away control from somebody who wants to do
some image processing on every frame, and it will also miss frames. So can we
have a newFrame event and/or a minTimeupdate property to say what the minimum
time interval between consecutive timeupdate events should be?
Re: [whatwg] VIDEO Timeupdate event frequency.
On Fri, Sep 10, 2010 at 7:28 PM, Biju <bijumaill...@gmail.com> wrote:
> https://bugzilla.mozilla.org/show_bug.cgi?id=571822
> Firefox fires the timeupdate event once per frame. Safari 5 and Chrome 6
> fire every 250ms. Opera 10.50 fires every 200ms. Now in Firefox bug 571822
> they are changing Firefox to fire the timeupdate event every 250ms. But
> this takes away control from somebody who wants to do some image
> processing on every frame, and it will also miss frames. So can we have a
> newFrame event and/or a minTimeupdate property to say what the minimum
> time interval between consecutive timeupdate events should be?

If we have a newFrame event, might it be an idea to actually hand over the
frame data (audio + video) in the event? I would think that only people who
want to do manipulations on the media data need that kind of resolution, and
it might be more efficient to just provide the data with the event.

Incidentally: what use case did you have in mind, Biju?

Cheers,
Silvia.
Re: [whatwg] VIDEO Timeupdate event frequency.
On Fri, Sep 10, 2010 at 4:05 AM, Silvia Pfeiffer
<silviapfeiff...@gmail.com> wrote:
> On Fri, Sep 10, 2010 at 7:28 PM, Biju <bijumaill...@gmail.com> wrote:
>> https://bugzilla.mozilla.org/show_bug.cgi?id=571822
>> Firefox fires the timeupdate event once per frame. Safari 5 and Chrome 6
>> fire every 250ms. Opera 10.50 fires every 200ms. Now in Firefox bug
>> 571822 they are changing Firefox to fire the timeupdate event every
>> 250ms. But this takes away control from somebody who wants to do some
>> image processing on every frame, and it will also miss frames. So can we
>> have a newFrame event and/or a minTimeupdate property to say what the
>> minimum time interval between consecutive timeupdate events should be?
>
> If we have a newFrame event, might it be an idea to actually hand over the
> frame data (audio + video) in the event? I would think that only people
> who want to do manipulations on the media data need that kind of
> resolution, and it might be more efficient to just provide the data with
> the event.

That would actually be a rather useful property. I have several examples of
video/canvas integration that I show off regularly at talks (and will have an
article about on html5doctors.com soon), where I just listen to the play
event and start running a function every 20ms, stopping when I see that the
video is stopped or paused. Just being able to register the function with a
newFrame event instead would be useful in terms of avoiding unnecessary
computation, and getting the data directly rather than having to draw the
video into a backing canvas and then ask for its ImageData would shave some
of the complexity off of the code.

How should it return the data? Perhaps the video data as an ImageData object?
I don't know how audio would be returned, though.

~TJ
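The "listen for play, poll every 20ms, stop when paused" pattern Tab describes can be sketched like this. The `processFrame` callback and the injectable `schedule` parameter are illustrative assumptions; only the play-event/20ms/stop-on-pause shape comes from the message:

```javascript
// Poll-while-playing sketch: run processFrame roughly every intervalMs while
// the video is neither paused nor ended. The schedule parameter defaults to
// setTimeout but is injectable so the loop can be driven deterministically.
function runWhilePlaying(video, processFrame, intervalMs, schedule) {
  schedule = schedule || function (fn, ms) { setTimeout(fn, ms); };
  function tick() {
    if (video.paused || video.ended) return;  // stop when playback stops
    processFrame(video);
    schedule(tick, intervalMs);
  }
  tick();
}

// Hypothetical browser wiring (guarded so the sketch also runs off-page):
if (typeof document !== 'undefined') {
  var v = document.querySelector('video');
  if (v) {
    v.addEventListener('play', function () {
      runWhilePlaying(v, function (video) {
        // draw `video` into a backing canvas and getImageData() here
      }, 20);
    }, false);
  }
}
```

This is exactly the shape a newFrame event would replace: the browser, not the page, would decide when `processFrame` runs.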
Re: [whatwg] VIDEO Timeupdate event frequency.
If I understand correctly... I think we would be using this a lot in
transmedia integration for film/tv.

On Fri, Sep 10, 2010 at 10:53 AM, Tab Atkins Jr. <jackalm...@gmail.com> wrote:
> On Fri, Sep 10, 2010 at 4:05 AM, Silvia Pfeiffer
> <silviapfeiff...@gmail.com> wrote:
>> On Fri, Sep 10, 2010 at 7:28 PM, Biju <bijumaill...@gmail.com> wrote:
>>> https://bugzilla.mozilla.org/show_bug.cgi?id=571822
>>> Firefox fires the timeupdate event once per frame. Safari 5 and Chrome 6
>>> fire every 250ms. Opera 10.50 fires every 200ms. Now in Firefox bug
>>> 571822 they are changing Firefox to fire the timeupdate event every
>>> 250ms. But this takes away control from somebody who wants to do some
>>> image processing on every frame, and it will also miss frames. So can we
>>> have a newFrame event and/or a minTimeupdate property to say what the
>>> minimum time interval between consecutive timeupdate events should be?
>>
>> If we have a newFrame event, might it be an idea to actually hand over
>> the frame data (audio + video) in the event? I would think that only
>> people who want to do manipulations on the media data need that kind of
>> resolution, and it might be more efficient to just provide the data with
>> the event.
>
> That would actually be a rather useful property. I have several examples
> of video/canvas integration that I show off regularly at talks (and will
> have an article about on html5doctors.com soon), where I just listen to
> the play event and start running a function every 20ms, stopping when I
> see that the video is stopped or paused. Just being able to register the
> function with a newFrame event instead would be useful in terms of
> avoiding unnecessary computation, and getting the data directly rather
> than having to draw the video into a backing canvas and then ask for its
> ImageData would shave some of the complexity off of the code.
>
> How should it return the data? Perhaps the video data as an ImageData
> object? I don't know how audio would be returned, though.
>
> ~TJ
Re: [whatwg] VIDEO Timeupdate event frequency.
On 9/10/10 10:53 AM, Tab Atkins Jr. wrote:
> I don't know how audio would be returned, though.

Mozilla is using a typed array buffer holding 32-bit floats for its audio
data API stuff, I believe.

-Boris
Re: [whatwg] VIDEO Timeupdate event frequency.
On Sep 10, 2010, at 7:53 AM, Tab Atkins Jr. wrote:
> On Fri, Sep 10, 2010 at 4:05 AM, Silvia Pfeiffer
> <silviapfeiff...@gmail.com> wrote:
>> On Fri, Sep 10, 2010 at 7:28 PM, Biju <bijumaill...@gmail.com> wrote:
>>> https://bugzilla.mozilla.org/show_bug.cgi?id=571822
>>> Firefox fires the timeupdate event once per frame. Safari 5 and Chrome 6
>>> fire every 250ms. Opera 10.50 fires every 200ms. Now in Firefox bug
>>> 571822 they are changing Firefox to fire the timeupdate event every
>>> 250ms. But this takes away control from somebody who wants to do some
>>> image processing on every frame, and it will also miss frames. So can we
>>> have a newFrame event and/or a minTimeupdate property to say what the
>>> minimum time interval between consecutive timeupdate events should be?
>>
>> If we have a newFrame event, might it be an idea to actually hand over
>> the frame data (audio + video) in the event? I would think that only
>> people who want to do manipulations on the media data need that kind of
>> resolution, and it might be more efficient to just provide the data with
>> the event.
>
> That would actually be a rather useful property. I have several examples
> of video/canvas integration that I show off regularly at talks (and will
> have an article about on html5doctors.com soon), where I just listen to
> the play event and start running a function every 20ms, stopping when I
> see that the video is stopped or paused. Just being able to register the
> function with a newFrame event instead would be useful in terms of
> avoiding unnecessary computation, and getting the data directly rather
> than having to draw the video into a backing canvas and then ask for its
> ImageData would shave some of the complexity off of the code.

The problem with a 'newFrame' callback is what to do if the callback takes
longer than the duration of a single frame. Does the video engine start
dropping frames, or does the video lag?

In WebKit on Mac, video playback is hardware-accelerated, and the
presentation of video frames is disconnected from the web page drawing
machinery. A newFrame callback would force us to drop back into software
rendering, which is significantly more CPU intensive. I don't support the
general use of a 'newFrame' callback except in the context of video
processing via canvas.

Simon
Re: [whatwg] VIDEO Timeupdate event frequency.
On Fri, Sep 10, 2010 at 9:58 AM, Simon Fraser <s...@me.com> wrote:
> The problem with a 'newFrame' callback is what to do if the callback takes
> longer than the duration of a single frame. Does the video engine start
> dropping frames, or does the video lag?

Dropping frames would be the better solution, for all the uses I'd put it to.
(Or rather, dropping newFrame events.)

> In WebKit on Mac, video playback is hardware-accelerated, and the
> presentation of video frames is disconnected from the web page drawing
> machinery. A newFrame callback would force us to drop back into software
> rendering, which is significantly more CPU intensive. I don't support the
> general use of a 'newFrame' callback except in the context of video
> processing via canvas.

In general, video processing via canvas is going to require dropping into
software rendering, right? I think that's what I was hearing from our dudes
putting hardware-accelerated video into Chrome. So at least in the case that
I can see this often being put towards, you don't lose anything.

~TJ
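The "drop newFrame events rather than lag" behavior Tab prefers could be sketched as a dispatcher that refuses to deliver a new frame while the handler for the previous one is still running. Everything here is a hypothetical illustration; no such API exists in any browser discussed in the thread:

```javascript
// Sketch: deliver frame notifications to a handler, dropping (not queueing)
// any notification that arrives while the handler is still busy. In a real
// engine the "arrives while busy" case comes from the decoder; here it is
// simulated by re-entrant calls to onFrame.
function makeDroppingDispatcher(handler) {
  var busy = false;
  var dropped = 0;
  return {
    onFrame: function (frame) {
      if (busy) { dropped++; return; }  // drop instead of letting video lag
      busy = true;
      try {
        handler(frame);
      } finally {
        busy = false;
      }
    },
    droppedCount: function () { return dropped; }
  };
}
```

The design choice this encodes is Simon's question answered Tab's way: playback stays real-time and the page's processing simply sees fewer frames under load.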
Re: [whatwg] VIDEO Timeupdate event frequency.
On Sep 10, 2010, at 10:07 AM, Tab Atkins Jr. wrote:
> On Fri, Sep 10, 2010 at 9:58 AM, Simon Fraser <s...@me.com> wrote:
>> The problem with a 'newFrame' callback is what to do if the callback
>> takes longer than the duration of a single frame. Does the video engine
>> start dropping frames, or does the video lag?
>
> Dropping frames would be the better solution, for all the uses I'd put it
> to. (Or rather, dropping newFrame events.)
>
>> In WebKit on Mac, video playback is hardware-accelerated, and the
>> presentation of video frames is disconnected from the web page drawing
>> machinery. A newFrame callback would force us to drop back into software
>> rendering, which is significantly more CPU intensive. I don't support the
>> general use of a 'newFrame' callback except in the context of video
>> processing via canvas.
>
> In general, video processing via canvas is going to require dropping into
> software rendering, right? I think that's what I was hearing from our
> dudes putting hardware-accelerated video into Chrome. So at least in the
> case that I can see this often being put towards, you don't lose anything.

My concern would be pages registering for newFrame events just to do stuff
like updating a controller, which will vastly increase CPU usage.

Simon
Re: [whatwg] VIDEO Timeupdate event frequency.
I think an ideal API for video frame processing would involve handing video
frames to a Worker for processing.

Rob
-- 
"Now the Bereans were of more noble character than the Thessalonians, for
they received the message with great eagerness and examined the Scriptures
every day to see if what Paul said was true." [Acts 17:11]
Re: [whatwg] VIDEO Timeupdate event frequency.
On Fri, Sep 10, 2010 at 4:01 PM, Robert O'Callahan <rob...@ocallahan.org> wrote:
> I think an ideal API for video frame processing would involve handing
> video frames to a Worker for processing.

Mm, yeah, probably. But then you'd need to be able to do canvas on workers,
and hand the data back... This is a complex problem.

So... no newframe event for now, leave timeupdate as it is, and fix this in
the future?

~TJ
Re: [whatwg] VIDEO Timeupdate event frequency.
On Sat, Sep 11, 2010 at 12:53 AM, Tab Atkins Jr. <jackalm...@gmail.com> wrote:
> On Fri, Sep 10, 2010 at 4:05 AM, Silvia Pfeiffer
> <silviapfeiff...@gmail.com> wrote:
>> On Fri, Sep 10, 2010 at 7:28 PM, Biju <bijumaill...@gmail.com> wrote:
>>> https://bugzilla.mozilla.org/show_bug.cgi?id=571822
>>> Firefox fires the timeupdate event once per frame. Safari 5 and Chrome 6
>>> fire every 250ms. Opera 10.50 fires every 200ms. Now in Firefox bug
>>> 571822 they are changing Firefox to fire the timeupdate event every
>>> 250ms. But this takes away control from somebody who wants to do some
>>> image processing on every frame, and it will also miss frames. So can we
>>> have a newFrame event and/or a minTimeupdate property to say what the
>>> minimum time interval between consecutive timeupdate events should be?
>>
>> If we have a newFrame event, might it be an idea to actually hand over
>> the frame data (audio + video) in the event? I would think that only
>> people who want to do manipulations on the media data need that kind of
>> resolution, and it might be more efficient to just provide the data with
>> the event.
>
> That would actually be a rather useful property. I have several examples
> of video/canvas integration that I show off regularly at talks (and will
> have an article about on html5doctors.com soon), where I just listen to
> the play event and start running a function every 20ms, stopping when I
> see that the video is stopped or paused. Just being able to register the
> function with a newFrame event instead would be useful in terms of
> avoiding unnecessary computation, and getting the data directly rather
> than having to draw the video into a backing canvas and then ask for its
> ImageData would shave some of the complexity off of the code.

I do exactly the same thing in my demos. If browsers are currently not fast
enough to be able to process a newFrame event, might it be possible to hook
that up with a Web Worker then?

Silvia.
Re: [whatwg] VIDEO Timeupdate event frequency.
On Fri, Sep 10, 2010 at 7:05 AM, Silvia Pfeiffer
<silviapfeiff...@gmail.com> wrote:
> Incidentally: What use case did you have in mind, Biju?

I was thinking about applications like
https://developer.mozilla.org/samples/video/chroma-key/index.xhtml
( https://developer.mozilla.org/En/Manipulating_video_using_canvas )

Now it is using setTimeout, so if the processor is fast it will be processing
the same frame more than one time, wasting system resources, which may affect
other running processes.

If we use the timeupdate event we may be missing some frames, as the
timeupdate event only happens every 200ms or 250ms, i.e. 4 or 5 frames per
second. And we know there are videos which have more than 5 frames per
second.

Another idea is to make the timeupdate event frequency adjustable by the web
developer.

Thanks
Biju
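The chroma-key demo Biju links works by reading each frame's pixels out of a canvas and zeroing the alpha of green-screen pixels. A minimal sketch of that per-pixel step; the exact thresholds are assumptions for illustration, not the values from the MDN sample:

```javascript
// Chroma-key step in the spirit of the linked MDN demo: make roughly-green
// pixels transparent in a flat RGBA pixel array (the layout returned by
// ctx.getImageData(...).data). Thresholds are illustrative only.
function chromaKey(pixels) {
  for (var i = 0; i < pixels.length; i += 4) {
    var r = pixels[i], g = pixels[i + 1], b = pixels[i + 2];
    if (g > 100 && r < 100 && b < 100) {
      pixels[i + 3] = 0;  // alpha = 0: let the background show through
    }
  }
  return pixels;
}
```

In the demo this runs inside the setTimeout loop Biju describes, which is exactly why a too-fast loop reprocesses the same unchanged frame: nothing tells the page whether a new frame has actually arrived.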
Re: [whatwg] VIDEO Timeupdate event frequency.
On Sep 10, 2010, at 8:06 PM, Biju wrote:
> On Fri, Sep 10, 2010 at 7:05 AM, Silvia Pfeiffer
> <silviapfeiff...@gmail.com> wrote:
>> Incidentally: What use case did you have in mind, Biju?
>
> I was thinking about applications like
> https://developer.mozilla.org/samples/video/chroma-key/index.xhtml
> ( https://developer.mozilla.org/En/Manipulating_video_using_canvas )
> Now it is using setTimeout, so if the processor is fast it will be
> processing the same frame more than one time, wasting system resources,
> which may affect other running processes.

Perhaps, but it only burns cycles on those pages instead of burning cycles on
*every* page that uses a video element.

> If we use the timeupdate event we may be missing some frames, as the
> timeupdate event only happens every 200ms or 250ms, i.e. 4 or 5 frames per
> second.

Even in a browser that fires 'timeupdate' every frame, you *will* miss frames
on a heavily loaded machine because the event is fired asynchronously.

> And we know there are videos which have more than 5 frames per second.

So use a timer if you know that you want to update more frequently.

eric
Re: [whatwg] VIDEO Timeupdate event frequency.
On Sat, Sep 11, 2010 at 11:03 AM, Tab Atkins Jr. <jackalm...@gmail.com> wrote:
> On Fri, Sep 10, 2010 at 4:01 PM, Robert O'Callahan <rob...@ocallahan.org> wrote:
>> I think an ideal API for video frame processing would involve handing
>> video frames to a Worker for processing.
>
> Mm, yeah, probably. But then you'd need to be able to do canvas on
> workers, and hand the data back... This is a complex problem.

Most of the use cases I've seen just do get/putImageData, so it might make
sense to just provide raw frame data to the Worker and not introduce a canvas
dependency.

> So... no newframe event for now, leave timeupdate as it is, and fix this
> in the future?

I think so. Another factor is that a lot of the video effects people have
been using canvas for can actually be done with SVG filters, which can be
GPU-accelerated and are compatible with asynchronous compositing. So it might
be wise to focus on use-cases for video processing that aren't amenable to
SVG filters (or extensions thereof), and understand what their requirements
are.

Rob
-- 
"Now the Bereans were of more noble character than the Thessalonians, for
they received the message with great eagerness and examined the Scriptures
every day to see if what Paul said was true." [Acts 17:11]
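Roc's "raw frame data to the Worker, no canvas dependency" idea could look roughly like the following. The Worker wiring and file name are hypothetical; only the per-frame pixel processing below is runnable here, with an invert step standing in for real work:

```javascript
// Hypothetical main-thread side of the hand-off roc describes -- ship the
// raw RGBA buffer to a Worker instead of routing it through a canvas:
//
//   var worker = new Worker('frame-worker.js');      // assumed file name
//   worker.postMessage({ width: w, height: h, pixels: rgbaArray });
//   worker.onmessage = function (e) { /* display e.data.pixels */ };
//
// Inside frame-worker.js, the per-frame work might be something like:
function invertFrame(pixels) {
  // Invert RGB, leave alpha alone -- a stand-in for real per-frame effects.
  for (var i = 0; i < pixels.length; i += 4) {
    pixels[i]     = 255 - pixels[i];
    pixels[i + 1] = 255 - pixels[i + 1];
    pixels[i + 2] = 255 - pixels[i + 2];
  }
  return pixels;
}
```

The appeal of this shape is that the Worker only ever touches a plain pixel buffer, so neither canvas-on-workers nor any rendering API needs to exist for the processing itself.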