Issue 736: Make Cross-Domain caching easier
http://code.google.com/p/gears/issues/detail?id=736
New issue report by [EMAIL PROTECTED]:
Cross-domain caching is currently very difficult, even with cross-domain
workers. I run into this constantly; developers in our forums hit it as
well, and Brad says that partners also find it a difficult thing to work
with.
Here's an example of this issue. Let's say I have a blog that I want to
allow to work x-domain. If any of my posts link to images that are
x-domain resources, then I have 3 options:
1) get that domain to host a worker.js file (unlikely)
2) pull the images onto my own site (a pain)
3) use a proxy (now I have to rewrite all of my img URLs to include the
proxy URL)
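To make option 3 concrete, here is a minimal sketch of the URL rewriting it forces on you. The helper name, the proxy path ('/proxy'), and the 'url' query parameter are all assumptions for illustration, not anything Gears provides:

```javascript
// Hypothetical helper for option 3: rewrite a cross-domain URL so it is
// fetched through a same-origin proxy. proxyBase (e.g. '/proxy') and the
// 'url' query parameter are assumed conventions of your own proxy script.
function proxify(url, origin, proxyBase) {
  // Same-origin resources can be captured directly, so leave them alone.
  if (url.indexOf(origin + '/') === 0) {
    return url;
  }
  return proxyBase + '?url=' + encodeURIComponent(url);
}
```

Every img src in every post would have to pass through something like this before capture, which is exactly the rewriting burden described above.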
The same argument goes for JS/CSS includes -- quite often they are x-domain.
What if I wanted to contribute code to MediaWiki? I wouldn't be able to
just contribute a JS file that enables offline support and auto-detects
and caches images, because the images might be x-domain. I would have to
write detailed instructions on where to put worker.js files, or how to
set up a correct file proxy. That would be a nightmare to manage in the
JS code, and it takes a lot more time and know-how to pull off.
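A sketch of the "auto-detects + caches images" step shows where this breaks down: x-domain URLs have to be filtered out before capture. The function name, origin handling, and store name are assumptions; the commented-out Gears calls follow the LocalServer ResourceStore API:

```javascript
// Sketch of the auto-detect step: given the image URLs found on a page,
// keep only the ones Gears can actually capture. Without a worker.js on
// the other domain or a proxy, that means same-origin URLs only, which
// is the limitation being reported here.
function sameOriginImageUrls(imageUrls, origin) {
  var capturable = [];
  for (var i = 0; i < imageUrls.length; i++) {
    if (imageUrls[i].indexOf(origin + '/') === 0) {
      capturable.push(imageUrls[i]);
    }
  }
  return capturable;
}

// In a page with Gears loaded (gears_init.js), capture would then look
// something like this ('blog-images' is an assumed store name):
//   var localServer = google.gears.factory.create('beta.localserver');
//   var store = localServer.createStore('blog-images');
//   store.capture(sameOriginImageUrls(urls, 'http://example.org'),
//                 function (url, success) { /* x-domain images never
//                 get here; they were filtered out above */ });
```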
I'd really like to see improvements that make it easier to cache
x-domain resources. Maybe just allow image mimetypes to be x-domain
cached? Or let x-domain resources be cached but only served when the
original domain requests them?
When working with Gears, x-domain resources have taken the most
unnecessary time to solve and have been the biggest source of
frustration. It would be really nice to see some changes that make this
easier.
Issue attributes:
Status: New
Owner: ----
Labels: Type-Defect Priority-Medium