There are multiple interesting ideas being discussed:

1. Mapping files to persistent URLs.
2. Sharing persistent URLs between different origins.
3. Using the ServiceWorker [1] to redirect URL requests (and possibly
manage its own cache / files).
4. De-duping file copies using a Git-like scheme.

1. Mapping files to persistent URLs.
======================================================
Some things can be considered 'okay' to be slow: images, video startup,
the list of unread emails. Other things are very time-sensitive, such as
the initial page layout. The CSS and the fonts required for layout need to
be present before the body begins laying out; i.e., they need to be
available in the <head> section. If they are not, the page will
irritatingly flash as the parts arrive. Unless I'm missing something, this
means that objects retrieved asynchronously via promises/callbacks are not
fast enough.

For these "needed for layout" objects, persistent URLs are very attractive
because they can be retrieved quickly and referenced directly in the
<head> section. These persistent URLs could be implemented on top of a
filesystem, IndexedDB, or some other mechanism. A ServiceWorker could
remap the URL, but the data would still need to be available locally (e.g.,
on disk).
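
As a minimal sketch of the idea, assuming Chrome's prefixed FileSystem API
(and assuming persistent quota has already been granted); the file name and
sizes are hypothetical. Because the persistent URL is deterministic, later
page loads can reference it directly from CSS in the <head> without waiting
for any callback:

    // Download a font once and store it under a persistent URL.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/fonts/layout-font.woff', true);
    xhr.responseType = 'blob';
    xhr.onload = function () {
      var fontBlob = xhr.response;
      var requestFS = window.requestFileSystem ||
                      window.webkitRequestFileSystem;
      requestFS(window.PERSISTENT, 16 * 1024 * 1024, function (fs) {
        fs.root.getFile('layout-font.woff', { create: true },
          function (entry) {
            entry.createWriter(function (writer) {
              writer.write(fontBlob);
              // On later visits the font is available at the deterministic
              // URL (see entry.toURL()), e.g.:
              //   filesystem:https://example.com/persistent/layout-font.woff
              // which CSS in the <head> can reference directly:
              //   src: url('filesystem:.../persistent/layout-font.woff')
            });
          });
      });
    };
    xhr.send();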


2. Sharing persistent URLs between different origins.
======================================================
Right now, any interesting object that is used by multiple origins (even
origins belonging to the same organization) must be downloaded and stored
once per origin. Imagine if Linux required glib to be stored separately for
every executable. That is how the web works today.

Shareable persistent URLs would allow a single copy to be downloaded and
shared across origins. As with shared libraries, the user of the shared
object has to trust the providing origin, and only the providing origin
should be able to write the data.
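
Nothing like this exists today; purely as an illustration of the shape such
sharing might take (every name below is hypothetical, not a real API):

    // Purely hypothetical API -- illustrating the idea, not a proposal.
    // Running on the providing origin, https://fonts.example.com, which
    // alone can write the object:
    var fontBlob = new Blob([/* font bytes downloaded earlier */]);
    navigator.sharedStorage.publish('cjk-base.woff', fontBlob);

    // A consuming origin references the single stored copy, read-only,
    // via a provider-scoped persistent URL:
    var url = navigator.sharedStorage.urlFor(
        'https://fonts.example.com', 'cjk-base.woff');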


3. Using the ServiceWorker to redirect URL requests (and possibly manage
its own cache / files)
======================================================
The ServiceWorker provides a way for a web page to redirect URL requests.
This is a very attractive feature for applications that are offline (or
have an unreliable connection). A request could be redirected to a
completely different URL or to data managed by the ServiceWorker itself;
e.g., the ServiceWorker could use the FileSystem API to store data and
redirect URLs to that data. Hopefully, this redirection will be fast; e.g.,
fast enough for "needed for layout" objects.
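
A sketch of the redirect idea, following the fetch/caches API shape in the
ServiceWorker explainer (the API was still in flux at the time of writing,
so treat the details as illustrative; the path and cache name are made up):

    // sw.js -- redirect a request to locally managed data.
    self.addEventListener('fetch', function (e) {
      var url = new URL(e.request.url);
      if (url.pathname === '/fonts/layout-font.woff') {
        e.respondWith(
          caches.open('layout-v1').then(function (cache) {
            return cache.match(e.request).then(function (hit) {
              // Serve the local copy if present; otherwise fetch it
              // once, store it, and serve it.
              return hit || fetch(e.request).then(function (resp) {
                cache.put(e.request, resp.clone());
                return resp;
              });
            });
          })
        );
      }
    });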

Each ServiceWorker is origin-specific: "they are not shared across domains,
and they are completely isolated from the browser's HTTP cache" [2]. I take
this to imply that a ServiceWorker has no ability to provide persistent
URLs to other origins.


4. De-duping file copies using a Git-like scheme.
======================================================
My sense is that everyone likes the idea of avoiding storing redundant
data, and that Git's use of the SHA-1 message digest as the filename is
'good enough'. This is a low-security-risk mechanism that is good for
storage efficiency. The benefit is greatest when the storage mechanism
(e.g., FileSystem, IndexedDB) applies it across origins. Like sharing
across origins, it avoids storing duplicates, but it does not address the
multiple-downloads issue. Multiple downloads are probably okay for smallish
files but could be an issue for larger files such as 20 MB Chinese fonts,
large JavaScript libraries, etc. My wild guess is that because this is a
'good thing to do' but not 'a critical thing to do', its odds of getting
implemented are poor.
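
To make the scheme concrete, here is a sketch of computing a
content-addressed key, assuming WebCrypto's crypto.subtle.digest (itself
still a draft at the time of writing); the storage call at the end is
hypothetical:

    // Byte-identical files yield the same SHA-1 key regardless of which
    // origin stores them, so a cross-origin store could keep one copy.
    function contentKey(arrayBuffer) {
      return crypto.subtle.digest('SHA-1', arrayBuffer)
        .then(function (digest) {
          return Array.prototype.map.call(new Uint8Array(digest),
            function (b) {
              return ('0' + b.toString(16)).slice(-2);  // hex-encode byte
            }).join('');
        });
    }

    // Usage, like Git's object store (store.put is hypothetical):
    // contentKey(fontBytes).then(function (key) {
    //   store.put(key, fontBytes);
    // });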


Brian Stell


Notes:
[1] https://github.com/slightlyoff/ServiceWorker
[2] https://github.com/slightlyoff/ServiceWorker/blob/master/caching.md

On Wed, Nov 6, 2013 at 8:28 AM, pira...@gmail.com <pira...@gmail.com> wrote:

> > That's very interesting and useful, but I don't think it fits the same
> > use case I was talking about.  I want the ability to create some object
> > that exports a URL that I can put in an iframe.  Then all requests from
> > that iframe for resources will dynamically call my javascript code.  I
> > could implement the same logic that a server-side application does, but
> > from local code in my browser.
> >
> That's exactly the purpose of ServiceWorker :-) Only that, from your
> message, I suspect you are asking for the same functionality but limited
> to the current session, or maybe only while the page is open, deleting it
> on reload. I don't know of anything like this; the most similar things
> would be the FirefoxOS Browser API or the Chrome FileSystem API, but
> nothing as powerful as ServiceWorker, sorry :-(
> They are talking about implementing the Fetch specification; maybe you
> could write to them about allowing the ServiceWorker functionality to be
> used on a per-session basis. I find your proposal legitimate...
>
>
>
> --
> "Si quieres viajar alrededor del mundo y ser invitado a hablar en un
> monton de sitios diferentes, simplemente escribe un sistema operativo
> Unix."
> – Linus Tordvals, creador del sistema operativo Linux
>
>
