On  6 Jun 02 at 22:58, [EMAIL PROTECTED] wrote:

>Hi folks,
>
>just tested this:
>
>If you want to download a big file or a complete website from within
>Arachne without using the cache, try this method:
>Add the following line to mime.cfg:
> file/dl.dgi       |@wget -r -i clip.tmp

Really great idea, Joerg. But I would recommend a slightly
different command line:

1. in general I would add:

"-t 3" to give the program three tries

"-c" to resumes an incomplete download

"-nH" to avoid creating subdirctories from the host name


2. I had difficulties with -r (recursive) and -p (embedded images
etc.) when I tried to download a huge file from an FTP site: wget
downloaded only a hidden directory listing file and then terminated.
So I reduced the command line to:

  wget -t 3 -c -nH -i clip.tmp
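To illustrate the mechanism behind -i: Arachne's dl.dgi handler puts the clicked URL into clip.tmp, and wget reads its URL list from that file. A minimal sketch of the same flow from a plain shell prompt (the URL is a placeholder of my own; the wget call itself is commented out so the sketch runs without a network connection):

```shell
# Write a one-line URL list by hand, as Arachne does via clip.tmp.
# The URL below is only a placeholder for illustration.
printf '%s\n' 'http://example.com/big.zip' > clip.tmp

# wget would then read the list with -i:
# wget -t 3 -c -nH -i clip.tmp

# Show what wget would receive
cat clip.tmp
```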


3. In order to download just one HTML page with its embedded
images, style sheets etc., I recommend:

 wget -t 3 -nv -c -nH -k -nr -L -p -i clip.tmp


4. In order to mirror a whole website, I have had the best results
with this command line:

 wget -t 3 -nv -c -N -r -x -nH -l inf -k -nr -L -p -np -i clip.tmp
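The three option sets from points 2-4 can be collected in one small helper script; here is a sketch in POSIX sh (the mode names "file", "page" and "mirror" are my own invention, and the command is only printed, not executed, so the sketch works offline):

```shell
#!/bin/sh
# Build one of the three wget command lines from this mail,
# selected by a mode argument; defaults to "file".
mode="${1:-file}"
case "$mode" in
  file)   opts="-t 3 -c -nH" ;;                  # plain download: 3 tries, resume
  page)   opts="-t 3 -nv -c -nH -k -nr -L -p" ;; # one page plus images/styles
  mirror) opts="-t 3 -nv -c -N -r -x -nH -l inf -k -nr -L -p -np" ;; # whole site
  *)      echo "usage: $0 file|page|mirror" >&2; exit 1 ;;
esac

# Print the full command instead of running it
echo "wget $opts -i clip.tmp"
```

Under DOS the same idea could live in a batch file, but a case/if dispatch on a mode name keeps the three flag sets in one place either way.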


Christof Lange


_______________________________________________

 Christof Lange <[EMAIL PROTECTED]>
 Prokopova 4, 130 00 Praha 3, Czech Republic
 phone: (+420-2) 22 78 06 73 / 22 78 20 02
 http://www.volny.cz/cce.zizkov

