On Thu, 20 Dec 2012, Jer Noble wrote:
On Dec 17, 2012, at 4:01 PM, Ian Hickson i...@hixie.ch wrote:
Should we add a preciseSeek() method with two arguments that does a
seek using the given rational time?
This method would be more useful if there were a way to retrieve the
media's
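The appeal of a rational-time seek can be sketched in plain JS. This is only a model of the preciseSeek(numerator, denominator) idea under discussion, not any shipped API; all names here are hypothetical. For NTSC "29.97" material, frames last exactly 1001/30000 s, so integer math recovers frame indices exactly where doubles may not:

```javascript
// Hypothetical sketch: media time as an exact rational {num, den} in
// seconds, so frame boundaries for 30000/1001 fps stay exact.

function frameStart(n, fpsNum = 30000, fpsDen = 1001) {
  // exact start time of frame n
  return { num: n * fpsDen, den: fpsNum };
}

function frameIndex(t, fpsNum = 30000, fpsDen = 1001) {
  // floor(t * fpsNum / fpsDen) using integers only: no rounding error
  return Math.floor((t.num * fpsNum) / (t.den * fpsDen));
}
```

Round-tripping frameStart through frameIndex returns the original frame number for any index in a realistic range, which is precisely what double-based currentTime cannot guarantee.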
On 2012/12/18 9:01, Ian Hickson wrote:
On Tue, 2 Oct 2012, Jer Noble wrote:
The nature of floating point math makes precise frame navigation
difficult, if not impossible. Rob's test is especially hairy, given
that each frame has a timing bound of [startTime, endTime), and his test
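Jer's point shows up even with round numbers. A minimal demonstration (my example, not from the thread), using a hypothetical 10 fps clip where frame 3 starts at exactly 0.3 s:

```javascript
// Why [startTime, endTime) frame bounds are hairy with doubles:
// neither 0.3 nor 0.1 is exactly representable in binary floating
// point, and 0.3 / 0.1 evaluates to 2.9999999999999996, so naive
// index recovery lands in frame 2 instead of frame 3.
const frameDuration = 0.1;  // 10 fps
const seekTarget = 0.3;     // intended: the first instant of frame 3
const naiveIndex = Math.floor(seekTarget / frameDuration);
console.log(naiveIndex);    // 2, not 3
```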
On Thu, 20 Dec 2012, Mark Callow wrote:
On 2012/12/18 9:01, Ian Hickson wrote:
On Tue, 2 Oct 2012, Jer Noble wrote:
The nature of floating point math makes precise frame navigation
difficult, if not impossible. Rob's test is especially hairy, given
that each frame has a timing bound of
On 12/20/12 9:54 AM, Ian Hickson wrote:
Everything in the Web platform already uses doubles.
Except WebGL. And Audio API wave tables, sample rates, AudioParams, PCM
data (though thankfully times in Audio API do use doubles). And
graphics libraries used to implement canvas, in many cases...
On 2012/12/21 2:54, Ian Hickson wrote:
On Thu, 20 Dec 2012, Mark Callow wrote:
I draw your attention to Don't Store that in a float
http://randomascii.wordpress.com/2012/02/13/dont-store-that-in-a-float/
and its suggestion to use a double starting at 2^32 to avoid the issue
around
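The linked article's trick can be verified directly (the numbers below are mine, not from the thread): start the clock at 2^32 s so the precision of a double is uniform, one ulp = 2^-20 s (about a microsecond), rather than silently degrading as the counter grows.

```javascript
// Spacing of adjacent doubles in [2^32, 2^33) is 2^32 * 2^-52 = 2^-20.
const base = 2 ** 32;
const ulp = 2 ** -20;
console.log(base + ulp > base);       // true: microsecond steps survive
console.log(base + ulp / 2 === base); // true: anything finer is lost
```

Starting at 2^32 means any precision problem is constant and visible from the first frame, instead of appearing only hours into playback.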
On Fri, 21 Dec 2012, Mark Callow wrote:
On 2012/12/21 2:54, Ian Hickson wrote:
On Thu, 20 Dec 2012, Mark Callow wrote:
I draw your attention to Don't Store that in a float
http://randomascii.wordpress.com/2012/02/13/dont-store-that-in-a-float/
and its suggestion to use a double starting
On Dec 20, 2012, at 7:27 PM, Mark Callow callow.m...@artspark.co.jp wrote:
On 2012/12/21 2:54, Ian Hickson wrote:
On Thu, 20 Dec 2012, Mark Callow wrote:
I draw your attention to Don't Store that in a float
http://randomascii.wordpress.com/2012/02/13/dont-store-that-in-a-float/
and its
On Tue, 2 Oct 2012, Jer Noble wrote:
On Sep 17, 2012, at 12:43 PM, Ian Hickson i...@hixie.ch wrote:
On Mon, 9 Jul 2012, adam k wrote:
i'm aware that crooked framerates (i.e. the notorious 29.97) were not
supported when frame accuracy was implemented. in my tests, 29.97DF
timecodes
On Sep 17, 2012, at 12:43 PM, Ian Hickson i...@hixie.ch wrote:
On Mon, 9 Jul 2012, adam k wrote:
i have a 25fps video, h264, with a burned in timecode. it seems to be
off by 1 frame when i compare the burned in timecode to the calculated
timecode. i'm using rob coenen's test app at
On Wed, Oct 3, 2012 at 6:41 AM, Jer Noble jer.no...@apple.com wrote:
On Sep 17, 2012, at 12:43 PM, Ian Hickson i...@hixie.ch wrote:
On Mon, 9 Jul 2012, adam k wrote:
i have a 25fps video, h264, with a burned in timecode. it seems to be
off by 1 frame when i compare the burned in timecode to
On Thu, 7 Jun 2012, Kit Grose wrote:
On 06/06/2012, at 7:44 AM, Ian Hickson wrote:
On Fri, 13 Jan 2012, Kit Grose wrote:
I'd argue that while we did receive in WebM a common codec it does
not enjoy the sort of universal adoption required to be able to
mandate its support in the spec,
On Thu, 7 Jul 2011, Eric Winkelman wrote:
On Thursday, June 02 Ian Hickson wrote:
On Fri, 18 Mar 2011, Eric Winkelman wrote:
For in-band metadata tracks, there is neither a standard way to
represent the type of metadata in the HTMLTrackElement interface nor
is there a standard way
-Original Message-
From: whatwg-boun...@lists.whatwg.org [mailto:whatwg-boun...@lists.whatwg.org] On Behalf Of Mark Watson

Sent: Monday, June 20, 2011 2:29 AM
To: Eric Carlson
Cc: Silvia Pfeiffer; whatwg Group; Simon Pieters
Subject: Re: [whatwg] Video feedback
On Jun 9, 2011
On Thursday, June 02 Ian Hickson wrote:
On Fri, 18 Mar 2011, Eric Winkelman wrote:
For in-band metadata tracks, there is neither a standard way to
represent the type of metadata in the HTMLTrackElement interface nor
is there a standard way to represent multiple different types of
On Jun 9, 2011, at 4:32 PM, Eric Carlson wrote:
On Jun 9, 2011, at 12:02 AM, Silvia Pfeiffer wrote:
On Thu, Jun 9, 2011 at 4:34 PM, Simon Pieters sim...@opera.com wrote:
On Thu, 09 Jun 2011 03:47:49 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
For commercial video
On Mon, Jun 20, 2011 at 6:29 PM, Mark Watson wats...@netflix.com wrote:
On Jun 9, 2011, at 4:32 PM, Eric Carlson wrote:
On Jun 9, 2011, at 12:02 AM, Silvia Pfeiffer wrote:
On Thu, Jun 9, 2011 at 4:34 PM, Simon Pieters sim...@opera.com wrote:
On Thu, 09 Jun 2011 03:47:49 +0200, Silvia
On Jun 20, 2011, at 10:42 AM, Silvia Pfeiffer wrote:
On Mon, Jun 20, 2011 at 6:29 PM, Mark Watson
wats...@netflix.com wrote:
On Jun 9, 2011, at 4:32 PM, Eric Carlson wrote:
On Jun 9, 2011, at 12:02 AM, Silvia Pfeiffer wrote:
On Thu, Jun 9, 2011 at 4:34 PM, Simon
On Mon, Jun 20, 2011 at 7:31 PM, Mark Watson wats...@netflix.com wrote:
The TrackList object has an onchanged event, which I assumed would fire when
any of the information in the TrackList changes (e.g. tracks added or
removed). But actually the spec doesn't state when this event fires (as far
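The behavior Mark assumed can be modeled in a few lines. This is a toy, not the spec's TrackList interface; the names are illustrative only:

```javascript
// Toy model of a track list whose onchanged callback fires whenever
// tracks are added or removed -- the reading Mark describes above.
class SimpleTrackList {
  constructor() {
    this.tracks = [];
    this.onchanged = null; // assign a callback, event-handler style
  }
  _notify() {
    if (typeof this.onchanged === "function") this.onchanged();
  }
  add(track) {
    this.tracks.push(track);
    this._notify(); // fires on addition...
  }
  remove(track) {
    const i = this.tracks.indexOf(track);
    if (i !== -1) {
      this.tracks.splice(i, 1);
      this._notify(); // ...and on removal
    }
  }
}
```

Under this reading, a live stream that gains a commentary track and later drops it would fire the callback twice; the open question in the thread is whether the spec actually mandates that.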
On Jun 20, 2011, at 11:52 AM, Silvia Pfeiffer wrote:
On Mon, Jun 20, 2011 at 7:31 PM, Mark Watson wats...@netflix.com wrote:
The TrackList object has an onchanged event, which I assumed would fire
when
any of the information in the TrackList changes (e.g. tracks added or
removed). But
On Tue, Jun 21, 2011 at 12:07 AM, Mark Watson wats...@netflix.com wrote:
On Jun 20, 2011, at 11:52 AM, Silvia Pfeiffer wrote:
On Mon, Jun 20, 2011 at 7:31 PM, Mark Watson wats...@netflix.com wrote:
The TrackList object has an onchanged event, which I assumed would fire
when
any of the
On Jun 20, 2011, at 5:28 PM, Silvia Pfeiffer wrote:
On Tue, Jun 21, 2011 at 12:07 AM, Mark Watson wats...@netflix.com wrote:
On Jun 20, 2011, at 11:52 AM, Silvia Pfeiffer wrote:
On Mon, Jun 20, 2011 at 7:31 PM, Mark Watson wats...@netflix.com wrote:
The TrackList object has an
On Thu, 09 Jun 2011 03:47:49 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
For commercial video providers, the tracks in a live stream change all
the time; this is not limited to audio and video tracks but would
include text tracks as well.
OK, all this indicates to me that we
On Thu, Jun 9, 2011 at 4:34 PM, Simon Pieters sim...@opera.com wrote:
On Thu, 09 Jun 2011 03:47:49 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
For commercial video providers, the tracks in a live stream change all
the time; this is not limited to audio and video tracks but would
On Jun 9, 2011, at 12:02 AM, Silvia Pfeiffer wrote:
On Thu, Jun 9, 2011 at 4:34 PM, Simon Pieters sim...@opera.com wrote:
On Thu, 09 Jun 2011 03:47:49 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
For commercial video providers, the tracks in a live stream change all
the time;
On Wed, 08 Jun 2011 02:46:15 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
On Tue, Jun 7, 2011 at 7:04 PM, Philip Jägenstedt phil...@opera.com
wrote:
On Sat, 04 Jun 2011 03:39:58 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
On Fri, Jun 3, 2011 at 9:28 AM, Ian Hickson
On Wed, Jun 8, 2011 at 6:14 PM, Philip Jägenstedt phil...@opera.com wrote:
On Wed, 08 Jun 2011 02:46:15 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
On Tue, Jun 7, 2011 at 7:04 PM, Philip Jägenstedt phil...@opera.com
wrote:
On Sat, 04 Jun 2011 03:39:58 +0200, Silvia Pfeiffer
On Wed, 08 Jun 2011 12:35:24 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
On Wed, Jun 8, 2011 at 6:14 PM, Philip Jägenstedt phil...@opera.com
wrote:
On Wed, 08 Jun 2011 02:46:15 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
That is all correct. However, because it is
On Wed, Jun 8, 2011 at 9:18 PM, Philip Jägenstedt phil...@opera.com wrote:
On Wed, 08 Jun 2011 12:35:24 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
On Wed, Jun 8, 2011 at 6:14 PM, Philip Jägenstedt phil...@opera.com
wrote:
On Wed, 08 Jun 2011 02:46:15 +0200, Silvia Pfeiffer
On Wed, 08 Jun 2011 13:38:18 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
On Wed, Jun 8, 2011 at 9:18 PM, Philip Jägenstedt phil...@opera.com
wrote:
On Wed, 08 Jun 2011 12:35:24 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
On Wed, Jun 8, 2011 at 6:14 PM, Philip
On Jun 8, 2011, at 3:35 AM, Silvia Pfeiffer wrote:
Nothing exposed via the current API would change, AFAICT.
Thus, after a change mid-stream to, say, a smaller video width and
height, would the video.videoWidth and video.videoHeight attributes
represent the width and height of the
-Original Message-
From: whatwg-boun...@lists.whatwg.org [mailto:whatwg-boun...@lists.whatwg.org] On Behalf Of Eric Carlson

Sent: Wednesday, June 08, 2011 9:34 AM
To: Silvia Pfeiffer; Philip Jägenstedt
Cc: whatwg@lists.whatwg.org
Subject: Re: [whatwg] Video feedback
On Jun 8
@lists.whatwg.org
Subject: Re: [whatwg] Video feedback
On Jun 8, 2011, at 3:35 AM, Silvia Pfeiffer wrote:
Nothing exposed via the current API would change, AFAICT.
Thus, after a change mid-stream to, say, a smaller video width and
height, would the video.videoWidth
On Sat, 04 Jun 2011 03:39:58 +0200, Silvia Pfeiffer
silviapfeiff...@gmail.com wrote:
On Fri, Jun 3, 2011 at 9:28 AM, Ian Hickson i...@hixie.ch wrote:
On Thu, 16 Dec 2010, Silvia Pfeiffer wrote:
I do not know how technically the change of stream composition works in
MPEG, but in Ogg we
On Fri, 03 Jun 2011 01:28:45 +0200, Ian Hickson i...@hixie.ch wrote:
On Fri, 22 Oct 2010, Simon Pieters wrote:
Actually it was me, but that's OK :)
There was also some discussion about metadata. Language is sometimes
necessary for the font engine to pick the right glyph.
Could you
I'll be replying to WebVTT related stuff in a separate thread. Here
just feedback on the other stuff.
(Incidentally: why is there details element feedback in here with
video? I don't really understand the connection.)
On Fri, Jun 3, 2011 at 9:28 AM, Ian Hickson i...@hixie.ch wrote:
On Thu, 16
On Thu, Jun 2, 2011 at 7:28 PM, Ian Hickson i...@hixie.ch wrote:
We can add comments pretty easily (e.g. we could say that ! starts a
comment and ends it -- that's already being ignored by the current
parser), if people really need them. But are comments really that useful?
Did SRT have
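One reading of the "!" proposal (which was never adopted; shipped WebVTT uses NOTE blocks for comments instead) is a pre-filter that drops comment lines before display. A sketch under that assumption:

```javascript
// Hypothetical "!" comment convention for cue text: strip any line
// the proposed parser would ignore. Not real WebVTT syntax.
function stripBangComments(cueText) {
  return cueText
    .split("\n")
    .filter(line => !line.trimStart().startsWith("!"))
    .join("\n");
}
```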
On Feb 9, 2010, at 9:03 PM, Ian Hickson wrote:
On Sat, 31 Oct 2009, Brian Campbell wrote:
As a multimedia developer, I am wondering about the purpose of the timeupdate
event on media elements.
Its primary use is keeping the UIs updated (specifically the timers and
the scrubber bars).
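That UI-updating use case fits in two pure helpers that a timeupdate handler might call (the function names are mine, not from the spec):

```javascript
// Map currentTime to a scrubber position, clamped to 100%.
function scrubberPercent(currentTime, duration) {
  if (!Number.isFinite(duration) || duration <= 0) return 0;
  return Math.min(100, (currentTime / duration) * 100);
}

// Format currentTime as the m:ss label shown next to the scrubber.
function timerLabel(currentTime) {
  const m = Math.floor(currentTime / 60);
  const s = Math.floor(currentTime % 60);
  return `${m}:${String(s).padStart(2, "0")}`;
}
```

For this purpose a few events per second is plenty, which is the context for Brian's complaint below about the 4 Hz rate being too coarse for tighter media synchronization.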
On Feb 10, 2010, at 8:01 AM, Brian Campbell wrote:
On Feb 9, 2010, at 9:03 PM, Ian Hickson wrote:
On Sat, 31 Oct 2009, Brian Campbell wrote:
At 4 timeupdate events per second, it isn't all that useful. I can
replace it with setInterval, at whatever rate I want, query the time,
and get
On 2/10/10 1:37 PM, Eric Carlson wrote:
Have you actually tried this? Rendering video frames to a canvas and
processing every pixel from script is *extremely* processor intensive, you are
unlikely to get reasonable frame rate.
There's a demo that does just this at
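The per-pixel work being benchmarked looks like the following. This is my illustrative example, shaped like the ImageData.data buffer returned by ctx.getImageData() after drawing a video frame into a canvas:

```javascript
// Grayscale pass over a flat RGBA buffer using BT.601 luma weights.
function toGrayscale(data) {
  for (let i = 0; i < data.length; i += 4) {
    const y = Math.round(
      0.299 * data[i] + 0.587 * data[i + 1] + 0.114 * data[i + 2]
    );
    data[i] = data[i + 1] = data[i + 2] = y; // leave alpha as-is
  }
  return data;
}
```

Even a modest 640x480 frame at 30 fps means over nine million of these pixel visits per second from script, which is Eric's point about frame rate.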
On Feb 10, 2010, at 1:37 PM, Eric Carlson wrote:
On Feb 10, 2010, at 8:01 AM, Brian Campbell wrote:
On Feb 9, 2010, at 9:03 PM, Ian Hickson wrote:
On Sat, 31 Oct 2009, Brian Campbell wrote:
At 4 timeupdate events per second, it isn't all that useful. I can
replace it with
On 2/10/10 2:19 PM, Brian Campbell wrote:
Do browsers fire events for which there are no listeners?
It varies. Gecko, for example, fires image load events no matter what
but only fires mutation events if there are listeners.
-Boris
On Wed, Feb 10, 2010 at 11:29 AM, Boris Zbarsky bzbar...@mit.edu wrote:
On 2/10/10 2:19 PM, Brian Campbell wrote:
Do browsers fire events for which there are no listeners?
It varies. Gecko, for example, fires image load events no matter what but
only fires mutation events if there are
On Thu, Feb 11, 2010 at 8:19 AM, Brian Campbell lam...@continuation.org wrote:
But no, this isn't something I would consider to be production quality. But
perhaps if the WebGL typed arrays catch on, and start being used in more
places, you might be able to start doing this with reasonable
On Wed, Feb 10, 2010 at 4:37 PM, Robert O'Callahan rob...@ocallahan.org wrote:
On Thu, Feb 11, 2010 at 8:19 AM, Brian Campbell lam...@continuation.org
wrote:
But no, this isn't something I would consider to be production quality.
But perhaps if the WebGL typed arrays catch on, and start being
On Thu, Feb 11, 2010 at 3:01 AM, Brian Campbell lam...@continuation.org wrote:
On Feb 9, 2010, at 9:03 PM, Ian Hickson wrote:
On Sat, 7 Nov 2009, Silvia Pfeiffer wrote:
I use timeupdate to register a callback that will update
captions/subtitles.
That's only a temporary situation, though,
On Wed, 28 Oct 2009, Kit Grose wrote:
I've been working on my first HTML5 frontend, which is using the video
element, and I've run into a part of the spec that I disagree with (and
would like to understand its justification):
Content may be provided inside the video element. User agents
On Thu, 26 Mar 2009, Matthew Gregan wrote:
At 2009-03-25T10:16:32+, Ian Hickson wrote:
On Fri, 13 Mar 2009, Matthew Gregan wrote:
It's possible that neither a 'play' nor 'playing' event will be fired
when a media element that has ended playback is played again. When
first played,
On Wed, 4 Mar 2009, Chris Pearce wrote:
The media element spec says:
If a media element whose |networkState| has the value |NETWORK_EMPTY|
is inserted into a document, the user agent must asynchronously invoke
the media element's resource selection algorithm.
The resource selection
At 2009-03-25T10:16:32+, Ian Hickson wrote:
On Fri, 13 Mar 2009, Matthew Gregan wrote:
It's possible that neither a 'play' nor 'playing' event will be fired
when a media element that has ended playback is played again. When
first played, paused is set to false. When played again,
49 matches