We're assuming it's all static content, okay?
That is, you're not logging into the webserver, right?
-just spidering...

   Ben

PS - wget has some nice options for adapting the files for local
browsing (i.e., making links relative) and other neat features that
have been discussed here before.  I think it even has options to cloak
or alter its user-agent string... in case AOL is blacklisting wget (wtf?)
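
Something along these lines should do it (a sketch, not tested against
AOL's hosting; the URL is a placeholder, and the user-agent string is
just an example of spoofing a browser):

```shell
#!/bin/sh
# Mirror a site for local browsing:
#   --mirror          recursive download with timestamping, infinite depth
#   --convert-links   rewrite links in fetched pages so they work locally
#   --page-requisites also grab images/CSS needed to render each page
#   --no-parent       don't ascend above the starting directory
#   --user-agent      report a browser-like UA, in case wget is blocked
wget --mirror --convert-links --page-requisites --no-parent \
     --user-agent="Mozilla/5.0 (compatible)" \
     http://example.com/somepage/
```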

On Fri, 27 Jun 2003 15:04:34 -0700
Ken Barber <[EMAIL PROTECTED]> wrote:

| Hi,
| 
| I'm trying to clone a website (with the owner's permission) and 
| wget isn't working correctly (could it have something to do with 
| the fact that it's hosted on AOL? hmmm).
| 
| I seem to recall the existence of another tool (other than wget) 
| to copy web sites but I cannot for the life of me remember its 
| name.
| 
| Any suggestions?
| 
| Thanx,
| 
| Ken
| _______________________________________________
| EuG-LUG mailing list
| [EMAIL PROTECTED]
| http://mailman.efn.org/cgi-bin/listinfo/eug-lug

