Glenn,

I don't think so ...  Your premise has about the same validity as a claim
that accidents are caused by red cars, which sometimes "spontaneously
trigger" a loss of control and a crash; and that if one doesn't cause a
crash spontaneously, an attempt to pull it over to the side of the road
will often cause a runaway and a crash.

Not *all* these pages *always* trigger a "runaway"; some I've encountered
*do* fit the "always" description, but not all.  At least one in the
"always" category is a PRAGMA NO CACHE page, although Arachne *did* cache
it or it couldn't have been loaded to the screen; a return to the page
meant it was always downloaded again, however ... despite still being in
cache.
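
To illustrate what I mean about the no-cache behaviour (just a rough
sketch in Python with made-up names, not Arachne's actual code): the body
has to be stored somewhere simply to put it on the screen, but on a
revisit the stored copy is ignored and the page is fetched again.

    # Rough sketch (Python, hypothetical names) of a revisit decision:
    # the file may well sit in cache, but "Pragma: no-cache" forces a
    # fresh download every time the page is returned to.
    def should_refetch(headers, cached_copy_exists):
        no_cache = headers.get("Pragma", "").lower() == "no-cache"
        return no_cache or not cached_copy_exists

    headers = {"Pragma": "no-cache", "Content-Type": "text/html"}
    print(should_refetch(headers, cached_copy_exists=True))  # True: re-downloaded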

Now the metacrawler page was 'fun' ...  it was not being cached, even
though there is nothing in the META tags that should have prevented it.
It took me about half a dozen tries before F-6 showed that the page even
existed; at that point an ALT-E shell-out let me find the page in cache.
In the same way, multiple attempts to get anything from "=" resulted in
failures ... F-4 couldn't work, even if it wanted to, because of the long
lines in the code.

And not *all* the pages without a 'size' will run away wildly if one
attempts to abort the download -- after 15 attempts at metacrawler I quit
trying to get clicking X to cause a runaway.  In point of fact, for as
many pages as I have "hit the X" on, I have never had one lead to a
"runaway", and I've been running 1.70r3 since the day it became
available, and I do more than a little bit of surfing.

The lack of "informed size" may be a symptom, but I think there's more
to it than that ... and I think 'no cache' or the failure of Arachne to
cache [or at least create headers] could be part of the problem also.
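
Roughly what I mean by "informed size" (again, only a sketch in Python,
not how Arachne actually does it): with a Content-Length the client knows
exactly when the file ends; without one it has to read until the server
closes the connection, and any slip in the "no more data" test lets the
byte counter climb forever.

    import socket

    # Sketch: a download loop with and without an informed size.
    def download(sock: socket.socket, content_length=None):
        received = 0
        while True:
            chunk = sock.recv(4096)
            if not chunk:              # server closed the connection
                break
            received += len(chunk)
            if content_length is not None and received >= content_length:
                break                  # informed size: a clean, early stop
        return received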

And as far as that goes, this problem with the endless download of files
-- because the page needs more than 256 items cached -- could very well
be linked to the endless count-up of non-existent bytes in a file, the
problem we currently refer to [when being polite] as "runaway download."
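
If the cache really is limited to 256 items, a page needing more than
that would thrash it along these lines (a toy sketch in Python with
made-up numbers, purely to show the mechanism, not Arachne's code):

    CACHE_SLOTS = 256

    def render_attempts(items_on_page, slots=CACHE_SLOTS, give_up_after=10):
        cache, downloads = [], 0
        for attempt in range(1, give_up_after + 1):
            missing = [i for i in range(items_on_page) if i not in cache]
            if not missing:
                return attempt, downloads   # everything present at once
            for item in missing:
                downloads += 1
                cache.append(item)
                if len(cache) > slots:
                    cache.pop(0)            # evict something the page still needs
        return None, downloads              # never settles: the "endless" case

    print(render_attempts(300))   # (None, 696): more items than slots never converges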

l.d.
====


On Sat, 26 May 2001 20:46:49 -0400, Glenn McCorkle wrote:
<snip> my stuff

> Thanks very much for such diligent work.
> However, we already know what type of page causes it.

> Any page which does not send the file size of the HTM, HTML,
> SHTML, ASP, CGI, etc, etc, etc...

> Such pages will sometimes "spontaneously trigger" the bug.
> Others will be fine unless we attempt to abort the D/L before it has
> completed.

> Try aborting http://www.metacrawler.com/ at about 10kb
> (might take several tries to trigger the bug)

-- Arachne V1.70;rev.3, NON-COMMERCIAL copy, http://arachne.cz/
