Getting creative and non-standard with HTTP is guaranteed to break something. If it isn't proxies, it will be browser caches or search engines, and the failures will be very hard to diagnose.
Don't do it. Find a way which is 100% legal.

wunder

--On March 23, 2006 9:15:26 AM -0800 James M Snell <[EMAIL PROTECTED]> wrote:

> We're serving up our feeds with auth and ssl.
>
> Julian Reschke wrote:
>> James M Snell wrote:
>>> Hey, I never said I *wasn't* abusing the headers ;-) Can you say
>>> serendipitous hack? ;-)
>>>
>>> What we're doing may not be 100% per spec, but it doesn't seem to cause
>>> any problems either... other than some nominal weirdness in feed readers
>>> that support conditional GETs but don't keep a feed history (like
>>> Firefox Live Bookmarks). In other words, assuming that some offenses
>>> are more egregious than others, this is kind of like a rolling stop
>>> through a four-way intersection in the middle of the desert when there
>>> aren't any other vehicles around.
>>
>> I would expect to see problems as soon as caching HTTP proxies are in
>> the request path.
>>
>> Best regards, Julian

--
Walter Underwood
Principal Software Architect, Autonomy
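For reference, the spec-compliant conditional GET the thread keeps coming back to works roughly as below. This is a minimal server-side sketch in Python; the function name and the sample ETag values are illustrative, not taken from the thread. The point is that intermediaries like caching proxies rely on exactly this validator exchange behaving per spec.

```python
# Sketch of a per-spec conditional GET (RFC 2616 If-None-Match):
# the server compares the validator the client cached against the
# resource's current ETag and answers 304 Not Modified on a match.

def conditional_get(request_headers, current_etag, body):
    """Return (status, response_headers, body) for a conditional GET."""
    if request_headers.get("If-None-Match") == current_etag:
        # Client's cached copy is still current: send no body, so
        # caches along the path stay coherent.
        return 304, {"ETag": current_etag}, b""
    # Validator missing or stale: send the full representation.
    return 200, {"ETag": current_etag}, body

# Client revalidating an unchanged feed gets a 304 with an empty body:
status, resp_headers, resp_body = conditional_get(
    {"If-None-Match": '"v1"'}, '"v1"', b"<feed/>")

# Client with a stale validator gets a fresh 200:
status2, _, resp_body2 = conditional_get(
    {"If-None-Match": '"v0"'}, '"v1"', b"<feed/>")
```

A caching proxy is entitled to do this revalidation on the client's behalf, which is why any private reinterpretation of these headers tends to surface only once a proxy sits in the request path.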
