Re: [whatwg] Exposing framerate / statistics of video playback and related feedback

2012-11-29 Thread Ian Hickson
On Tue, 1 May 2012, Charles Pritchard wrote:
 
 The list looked at having a (canvas) ctx.stream = mediaElement; option 
 to better copy frames from a media stream into Canvas. I don't think 
 that the assignment operator will work, but it does seem like we could 
 optimize our drawImage calls to only happen when needed. At present, we 
 simply would run requestAnimationFrame. But, if a video stream is 
 operating on a slower frame rate than rAF, then the drawImage + rAF 
 method will be wasteful.
 
 I've suggested an onframeready event; it seems as though that event 
 could also carry the number of dropped frames.

I haven't added this, because copying a video one frame at a time doesn't 
seem like an efficient way to do anything. Some sort of mechanism like 
what I had proposed for WebRTC (not sure if it's still in there) seems 
like a more solid solution, letting the browser do all the heavy lifting 
(potentially off the main thread).
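For concreteness, the throttling Charles is after could be sketched like this. The `fps` value is an assumption passed in from out-of-band metadata, since (as the thread notes) the media API itself does not expose a frame rate:

```javascript
// Sketch: skip redundant drawImage calls when the video's frame rate is
// lower than the requestAnimationFrame rate. `fps` must be known out of
// band; nothing in the API exposes it.
function makeFrameGate(fps) {
  const frameDuration = 1 / fps;
  let lastDrawnTime = -Infinity;
  // Returns true when `currentTime` has advanced into a new frame.
  return function shouldDraw(currentTime) {
    if (currentTime - lastDrawnTime < frameDuration) return false;
    lastDrawnTime = currentTime;
    return true;
  };
}

// Intended wiring (browser-only, shown for context):
// const gate = makeFrameGate(24);
// function tick() {
//   if (gate(video.currentTime)) ctx.drawImage(video, 0, 0);
//   requestAnimationFrame(tick);
// }
// requestAnimationFrame(tick);
```

Even with this gate, rAF still fires at full rate; it only avoids the redundant copies, which is why an event fired per decoded frame would be the tidier primitive.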

-- 
Ian Hickson   U+1047E)\._.,--,'``.fL
http://ln.hixie.ch/   U+263A/,   _.. \   _\  ;`._ ,.
Things that are impossible just take longer.   `._.-(,_..'--(,_..'`-.;.'


Re: [whatwg] Exposing framerate / statistics of video playback and related feedback

2012-05-01 Thread Hugh Guiney
On Mon, Apr 30, 2012 at 7:37 PM, Ian Hickson i...@hixie.ch wrote:
 On Fri, 28 May 2010, Ian Fette wrote:

 Has any thought been given to exposing such metrics as framerate, how
 many frames are dropped, rebuffering, etc from the video tag?

 It has come up a lot, but the main question is: what is the use case?

Web-based non-linear editors. This software already exists—YouTube has
one: http://www.youtube.com/editor, Mozilla has one:
http://mozillapopcorn.org/popcorn-maker/, and there are/have been
several independent efforts as well
(http://lifehacker.com/5629683/jaycut-is-a-pretty-awesome-web+based-video-editor,
http://www.spacebarstudios411.com/easyclip/, etc).

Right now all of this software is alpha-stage, but the kinds of
problems they attempt to solve involve: pop-up annotations,
synchronized slide shows, clickable video areas, etc. Essentially,
they will allow users to author rich multimedia experiences that
aren't achievable with a traditional desktop NLE. Even if desktop NLEs
were to offer this functionality with an HTML export like Adobe is
doing with Flash CS6, it is advantageous to work in the destination
medium rather than in one fundamentally different from it; a similar
trend is happening right now as web designers move away from Photoshop
and begin to design directly in the browser, and I can only imagine
the same will happen with moving images, technology permitting.

As it stands, frame rate awareness is a feature of NLEs that you would
have to try very hard NOT to find. It is quite common for modern
camcorders to offer an array of different available frame rates, for
instance Panasonic's prosumer models (HVX200, HPX170 etc.) offer at
least 20 different fps options: 12, 15, 18, 20, 21, 22, 24, 25, 26,
27, 28, 30, 32, 34, 36, 40, 44, 48, 54, and 60. One of the primary
purposes of these options is to allow the user to achieve time
distortion effects; if your main timeline is 24fps, you could shoot at
12fps and play it back for fast motion, or shoot at 48fps for slow
motion. These are called "undercranking" and "overcranking"
respectively and have been in use since the dawn of film.

A ubiquitous UI paradigm in modern video editing is to have a timeline
with a set frame rate, that videos of alternate frame rates can be
dragged into to change their effective playback speed. Not only is
this useful for artistic time distortion effects, but also pragmatic
time distortion, such as mixing 24fps (US film) and 30fps (US
broadcast), 24fps with 25fps (European film), etc. with a non-variable
output frame rate.
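The conform math behind that UI paradigm is simple, which is exactly why the missing piece is the frame rate itself. A sketch, with illustrative rates:

```javascript
// Sketch: the "conform" factor an NLE applies when a clip of one frame
// rate is dropped into a timeline of another. Playing the clip's frames
// at the timeline's rate stretches or compresses time by this ratio.
function conformRate(clipFps, timelineFps) {
  return timelineFps / clipFps;
}

// Browser usage (for context):
// video.playbackRate = conformRate(48, 24); // 0.5: half-speed slow motion
// video.playbackRate = conformRate(12, 24); // 2:   double-speed fast motion
```

Without an exposed clip frame rate, `clipFps` can only be guessed, so the whole calculation rests on out-of-band metadata.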

Other use cases:

- Categorizing/filtering a video catalog by frame rate, such as on a
stock videography or film archive site, to see only those that match
the user's interest.

- Video player UI displaying the frame rate so that users can tell if
it is worthwhile to attempt playback on a slow connection or on a
device with limited playback capabilities. For instance, such a user might
discern that watching a 1080p60 video on a mobile device would take up
too much bandwidth BEFORE pressing play and having the video stutter
or play too slowly. Similarly, devices could detect this on their own
and report to the user.

- Frame-accurate subtitle authoring; timing the display of text with a
character's lip movements is a precise art, and if it is off by even a
few frames, it is distracting to the audience.

- NLE that ingests Edit Decision List (EDL) XML files, which denote
cuts, transitions, etc. in SMPTE timecode, so editors can work on
projects that were originally cut in another NLE. This would be
especially useful for desktop-to-web migration.

 If you have fixed frame rates, it's trivial to do the conversion to and
 from SMPTE timecode in JavaScript; you don't need any direct support from
 the media element API.

Yes, but we currently have no way of knowing what fixed frame rate we
are working with, making this kind of conversion impossible except
through pure guesswork. If frame rate is exposed we don't need SMPTE
internally.
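The conversion Ian calls trivial is indeed trivial once the rate is known; the sketch below assumes non-drop-frame timecode and a fixed integer `fps` supplied from out-of-band metadata, which is precisely the value the API does not provide:

```javascript
// Sketch: frame count <-> non-drop-frame SMPTE timecode, assuming the
// fixed frame rate is known. Drop-frame (29.97fps) timecode needs extra
// correction and is omitted here.
function framesToTimecode(frameCount, fps) {
  const f = frameCount % fps;
  const totalSeconds = Math.floor(frameCount / fps);
  const s = totalSeconds % 60;
  const totalMinutes = Math.floor(totalSeconds / 60);
  const m = totalMinutes % 60;
  const h = Math.floor(totalMinutes / 60);
  const pad = (n) => String(n).padStart(2, "0");
  return `${pad(h)}:${pad(m)}:${pad(s)}:${pad(f)}`;
}

function timecodeToFrames(tc, fps) {
  const [h, m, s, f] = tc.split(":").map(Number);
  return ((h * 60 + m) * 60 + s) * fps + f;
}
```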


Re: [whatwg] Exposing framerate / statistics of video playback and related feedback

2012-05-01 Thread Charles Pritchard

On 5/1/12 10:21 AM, Hugh Guiney wrote:

  If you have fixed frame rates, it's trivial to do the conversion to and
  from SMPTE timecode in JavaScript; you don't need any direct support from
  the media element API.

Yes, but we currently have no way of knowing what fixed frame rate we
are working with, making this kind of conversion impossible except
through pure guesswork. If frame rate is exposed we don't need SMPTE
internally.


The frame rate issue seems to have come up with media capture as well.

The list looked at having a (canvas) ctx.stream = mediaElement; option 
to better copy frames from a media stream into Canvas. I don't think 
that the assignment operator will work, but it does seem like we could 
optimize our drawImage calls to only happen when needed. At present, we 
simply would run requestAnimationFrame. But, if a video stream is 
operating on a slower frame rate than rAF, then the drawImage + rAF 
method will be wasteful.


I've suggested an onframeready event; it seems as though that event 
could also carry the number of dropped frames.


All that said, I'm a little behind on the media tags, so apologies if 
I've missed some already existing mechanisms.


-Charles



[whatwg] Exposing framerate / statistics of video playback and related feedback

2012-04-30 Thread Ian Hickson

There was a lot of e-mail on this topic, but a stark lack of descriptions 
of actual end-user use cases for these features, as will be clear in the 
responses I give below.

A quick reminder therefore that in adding features to HTML the first thing 
we want to look for is the problem that we are trying to solve. Without 
that, we don't know how to evaluate the proposed solutions! See this FAQ:

   
http://wiki.whatwg.org/wiki/FAQ#Is_there_a_process_for_adding_new_features_to_a_specification.3F


On Fri, 28 May 2010, Ian Fette wrote:

 Has any thought been given to exposing such metrics as framerate, how 
 many frames are dropped, rebuffering, etc from the video tag?

It has come up a lot, but the main question is: what is the use case?


 This is interesting not just for things like benchmarking,

Could you elaborate on this? Benchmarking what, by whom, and why?


 but for a site to determine if it is not working well for clients and 
 should instead e.g. switch down to a lower bitrate video.

If the problem is making sure the user can stream a resource with a 
bitrate such that the user can play the stream in real time as it is 
downloading, then the better solution seems to be a rate-negotiating media 
protocol, not an API to expose the frame rate. The frame rate may have 
nothing at all to do with the bandwidth: for example, if the user has a 
software decoder, the framerate could be low because the CPU is overloaded 
due to the user running a complicated simulation in another process. 
Similarly, the download rate could be slow because the bandwidth was 
throttled by the user because the user is doing other things and is happy 
to wait for the video to download in the background so that the user can 
watch it later.


On Sun, 30 May 2010, Jeroen Wijering wrote:
 
 For determining whether the user-agent is able to play a video, these 
 are the most interesting properties:
 
  readonly attribute unsigned long bandwidth::
 The current maximum server-to-client bandwidth, in bits per second.

How is this to be determined? In particular, for example, what should 
happen if the user has the page opened twice, or three times? Is the value 
the same in each tab, or is it reduced accordingly?


  readonly attribute unsigned long droppedframes::
 The number of frames dropped by the user agent since playback of this 
 video was initialized.

What use case does this number address?


On Fri, 2 Jul 2010, Jeroen Wijering wrote:
 
 The most useful ones are:
 
 *) droppedFrames: it can be used to determine whether the client can play the 
 video without stuttering.

The user can presumably tell if the video can play without stuttering just 
by watching it stutter or not stutter. Why would the page need to know? 
Surely there's nothing the page can do about it -- e.g., the playback 
might just be stuttering because the user agent is intentionally dropping 
every other frame because the video is actually not visible on the screen 
currently, or because the video is being played back at 2x speed.
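For concreteness, the client-side check Jeroen envisions could be as small as the sketch below; the `droppedFrames` and `totalFrames` counters are hypothetical inputs, since nothing in the spec exposed them at the time of this thread, and (as argued above) a high ratio need not indicate a bandwidth problem at all:

```javascript
// Sketch of the page-side heuristic under debate: decide whether to step
// down in bitrate from a hypothetical dropped-frame counter. The inputs
// are assumptions; they are not part of the media element API here.
function shouldStepDown(droppedFrames, totalFrames, threshold = 0.1) {
  if (totalFrames === 0) return false; // no data yet
  return droppedFrames / totalFrames > threshold;
}
```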


 *) maxBytesPerSecond: it can be used to determine the bandwidth of the 
 connection.

Only if nobody else is using the connection at the same time. What if two 
pages are both open at the same time and both use this to determine the 
connection speed? They'll start interfering with each other.


On Fri, 7 Jan 2011, Rob Coenen wrote:
 
 are there any plans on adding frame accuracy and/or SMPTE support to 
 HTML5 video?

Not without a use case. :-)


 As far as I know it's currently impossible to play HTML5 video 
 frame-by-frame, or seek to a SMPTE compliant (frame accurate) time-code. 
 The nearest seek seems to be precise to roughly 1-second (or nearest 
 keyframe perhaps, can't tell).

The API supports seeking to any frame, if you know its precise time 
index.
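Computing that precise time index again requires knowing the frame rate. A common sketch, assuming `fps` is known out of band, targets the midpoint of the frame's interval so that rounding cannot land the seek on the previous frame:

```javascript
// Sketch: seek target for frame N at a known fixed frame rate. The
// midpoint of the frame's interval is used rather than its exact start,
// to be robust against rounding in the seek.
function frameToSeekTime(frameNumber, fps) {
  return (frameNumber + 0.5) / fps;
}

// Browser usage (for context):
// video.currentTime = frameToSeekTime(240, 24); // seek to frame 240
```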


On Sun, 9 Jan 2011, Rob Coenen wrote:

 I have written a simple test using an H.264 video with burned-in
 timecode (every frame is visually marked with the actual SMPTE
 timecode). Webkit is unable to seek to the correct timecode using
 'currentTime'; it's always a whole bunch of frames off from the
 requested position. I reckon it simply seeks to the nearest keyframe?

That's a limitation of the implementation, not of the specification.


On Tue, 11 Jan 2011, Rob Coenen wrote:

 just a follow up question in relation to SMPTE / frame accurate 
 playback: As far as I can tell there is nothing specified in the HTML5 
 specs that will allow us to determine the actual frame rate (FPS) of a 
 movie? In order to do proper time-code calculations it's essential to 
 know both the video.duration and video.fps - and all I can find in the 
 specs is video.duration, nothing in video.fps

What is the use case?


On Tue, 12 Jan 2011, Rob Coenen wrote:
 
 [...] I'd like the 'virtual' FPS of the WebM file exposed to the 
 webbrowser- similar to how my other utilities report a FPS.

Why?


On Wed, 12 Jan 2011, Dirk-Willem van Gulik wrote:
 
 So that means that SMPTE time