On Fri, 17 Nov 2000, Bill Janssen wrote:
> sitescooper.pl -dump -mhtml -refresh \
> -site site_samples/linux/slashdot.site -filename Slashdot
>
> and it obligingly goes out and pulls over 18 pages from Slashdot.
> However, when I then look at the generated Slashdot.html top page, it
> has no links in it!
I just had the same problem.
I used your suggestion of -mhtml, since Plucker handles large docs
slowly (iSilo is *fast*) and I thought it might be snappier if
I split things into multiple pages.
It is.
But using -mhtml for the BBC World News site means the URL in the
.site file has to be exact. I think -html will follow an initial
redirect (or something along those lines), and -mhtml will not.
sitescooper/site_samples/regional_uk/bbc_news_world.site
changes from
--------------------------------------------------------------
URL: http://news.bbc.co.uk/low/english/world/default.htm
Name: BBC World News
Levels: 2
--------------------------------------------------------------
to
--------------------------------------------------------------
URL: http://news.bbc.co.uk/low/english/world/default.stm
Name: BBC World News
Levels: 2
--------------------------------------------------------------
The first worked with -html, but not with -mhtml - I got the results
you described.
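For anyone hitting the same thing with another site, a quick way to
find the URL the server actually wants is to let LWP follow the
redirect and print the final location. This is just a minimal sketch,
assuming LWP::UserAgent is available (it should be on any machine
that can run sitescooper); the BBC URL is the one from the old .site
file above:

    #!/usr/bin/perl -w
    use strict;
    use LWP::UserAgent;
    use HTTP::Request;

    # Start from the URL that only worked with -html.
    my $start = 'http://news.bbc.co.uk/low/english/world/default.htm';

    my $ua = LWP::UserAgent->new;
    # LWP follows redirects on GET by default; the response keeps a
    # reference to the request that finally succeeded, so its URI is
    # the one to paste into the URL: line of the .site file.
    my $resp = $ua->request(HTTP::Request->new(GET => $start));

    if ($resp->is_success) {
        print "Use this in the URL: line -> ", $resp->request->uri, "\n";
    } else {
        print "Fetch failed: ", $resp->status_line, "\n";
    }

Running that should spit out the .stm address directly, rather than
guessing at it by hand.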
Cheers, Andy!