On Thu, May 7, 2009 at 2:45 PM, Brian Eaton <bea...@google.com> wrote:

> On Thu, May 7, 2009 at 2:32 PM, John Hjelmstad <fa...@google.com> wrote:
> >> - no-store cache control headers
> >
> > Don't appear to affect this.
>
> Wow.  Weird.  I would have guessed that it would trigger HTTP requests
> on every RPC.  Or are you getting to reuse existing iframes?


Yes, this technique creates only a single IFRAME, which is reused ad nauseam
for message-passing. It is alternately resized 10px -> 20px -> 10px -> 20px
and so on after changing only the fragment (so no reload occurs).
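
Roughly, the send side looks like the sketch below (illustrative names
only, not the actual CL; it assumes the receiver re-reads its fragment
whenever it sees a resize):

  // Sender side: one relay iframe, created once and reused.
  let relay: HTMLIFrameElement;
  let grown = false;

  function initRelay(relayUrl: string): void {
    relay = document.createElement('iframe');
    relay.src = relayUrl;            // loaded once, then reused
    relay.style.width = '10px';
    relay.style.height = '10px';
    document.body.appendChild(relay);
  }

  function sendMessage(payload: string): void {
    // Changing only the fragment does not reload the iframe.
    const base = relay.src.split('#')[0];
    relay.src = base + '#' + encodeURIComponent(payload);
    // Toggle 10px <-> 20px so the child sees a resize event and
    // knows to re-read location.hash.
    grown = !grown;
    relay.style.height = grown ? '20px' : '10px';
  }

  // Receiver side (inside the relay document):
  window.onresize = () => {
    const msg = decodeURIComponent(location.hash.substring(1));
    if (msg) {
      // ...dispatch msg to the RPC layer...
    }
  };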


>
>
> > ...as would 401s on account of auth popups. This is one reason for
> > choosing robots.txt though -- for what purpose is robots.txt but for
> > anyone to access it? Arguably a robots.txt hidden behind a 401 is a
> > misconfiguration, no? (optimistically)
>
> All of the 301s/302s/401s/403s/cache control stuff is a
> misconfiguration.  Doesn't mean they won't happen.  In particular
> people sometimes stick filters on the root of their web site that do
> things you might not expect, like requiring authentication, or
> redirecting from https to http.  Fixing things done at the root of web
> sites is hard, you end up needing approval from web masters who would
> rather ignore your existence.
>
> What I'm getting at is that as annoying as the RPC relay URL is, we
> should probably still let people configure it themselves.  Maybe
> default to using robots.txt, but let people override it?
>
> You may have already done this in your code, TBH I haven't even looked
> at your implementation.


I haven't implemented it, but I did consider it. I decided to punt for the
moment, to keep the CL less unruly than it already is, but I'll put it in
shortly :)
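
For what it's worth, the override could be as simple as the sketch below
(hypothetical names, not what's in the CL):

  interface RpcConfig {
    relayPath?: string;  // e.g. '/gadgets/rpc_relay.html'
  }

  // Default to robots.txt -- by convention it is world-readable, so it
  // is unlikely to sit behind auth, though as noted misconfigurations
  // still happen.
  function getRelayUrl(targetOrigin: string, config: RpcConfig = {}): string {
    return targetOrigin + (config.relayPath ?? '/robots.txt');
  }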
