Hi Vladimir,

On Thu, 14 Aug 2003 14:02:45 +0300, "Vladimir" <[EMAIL PROTECTED]> wrote:

> please, somebody explain me step to step - how do i setting up the arachne for
> easy using wget for dos???

   I can tell you what I have done.

1)  I created a directory off my Arachne directory e.g. d:\arac171\wget

2)  I placed the un-zipped DOS WGET files into that directory:
WGET.EXE and WATTCP.CFG are all that is required here, but you may find it
useful to also make yourself a handy reference.
At the DOS prompt, type:   wget -h > wgethelp.txt
WGETHELP.TXT will have all the commandline parameters and a short
explanation of each.

3)  I edited the WATTCP.CFG file - but only the nameserver entries; the
rest is fine as supplied. WGET can use up to 10 different nameservers,
trying each one in turn until successful.
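As an illustration, the nameserver lines in my WATTCP.CFG look something
like the following (the IP addresses here are placeholders only -
substitute the ones your own ISP gives you):

===== WATTCP.CFG (nameserver lines only) =====
nameserver = 203.0.113.1
nameserver = 203.0.113.2
==============================================

You can list more of them on further "nameserver =" lines, up to the
limit of 10 mentioned above.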

4)  I placed the file WGET.OOK into the d:\arac171\oops directory.

===== WGET.OOK ====
cd wget
wget -c -r -k -np -i d:\arac171\clip.tmp
d:
cd\arac171
===================

Edit this file to match the directory where Arachne
and WGET are to be found.

5)  In "Options| Personal Settings"  I edited the keyboard shortcuts, so
that (on my setup) Shift+F9 is set to:  d:\arac171\oops\wget.ook

  You can use any other Shift+ combination, or even put a button into any
suitable Arachne gui file, whatever feels most comfortable.

How Does it Work ?
==================

When I am online (and only then) I can Right-Click on a file or website link,
which places that URL into CLIP.TMP.   Then I press Shift+F9.

I get a DOS screen, and I can see WGET connect to the URL and retrieve the file
(if it is a single file) or retrieve the web-page (graphics and all) if the
target is an HTM file.  Whatever is retrieved is saved into a directory that
matches the URL, which is created off the WGET directory. For a website, the
directory tree to hold graphics, CSS, multimedia files etc. is also created.
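To illustrate with a made-up address: retrieving a page such as
http://www.example.com/pics/photo.htm with the WGET.OOK command above
would produce a layout along these lines off the WGET directory:

===== resulting directory tree (illustration) =====
d:\arac171\wget\www.example.com\pics\photo.htm
d:\arac171\wget\www.example.com\pics\photo1.jpg   (and so on)
===================================================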

  When all is retrieved, I am returned to Arachne and the website where I
started from.  Once offline, I can go to the WGET directory, then
follow the directory tree to the website I have downloaded, and it
should all work.

  Note: The inclusion of "  -c  " in the command line tells WGET to resume a
partial download. This allows you to retrieve a large file in installments over
separate online sessions, but doesn't affect a retrieval that starts from scratch.

  The parameter "  -r  " tells WGET to retrieve recursively, in other words to
also get the files linked from a webpage (graphics etc.). If you only
ever want to download single files, you can edit " -r " out of
WGET.OOK, along with " -np " and " -k ". This also has the effect of saving
the (single) file directly into the WGET directory.
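For a single-files-only setup, the trimmed WGET.OOK might look like this
(the paths match my setup as described above - adjust them to your own):

===== WGET.OOK (single files only) =====
cd wget
wget -c -i d:\arac171\clip.tmp
d:
cd\arac171
========================================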

  The parameter "  -np  " means "no parent" and avoids retrieving an entire
website all the way back to the domain name.

  The parameter "  -k  " tells WGET to convert internal links in the
downloaded files so that they will work from your hard disk.

   The parameter "  -i  " passes whatever is in d:\arac171\clip.tmp to WGET as
the URL to be retrieved.

What Doesn't Work
=================
  If the link is represented by a graphic, then only the graphic is retrieved,
so choose text links whenever possible.

  If the files attached to a website (graphics, multimedia) have Windows long
filenames and the first 8 characters are identical, then WGET will save each
one on top of the previous one, so you end up with only the last such file saved.

  Be wary of extended websites. The default depth for recursive retrieval is 5
deep, and this can fill up smaller hard disks very quickly. If space is
marginal, then include "  -l2  " (lower case L and 2) in the command line of
WGET.OOK, which will limit the recursive retrieval depth to two levels only.
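With the depth limit added, the wget line in WGET.OOK would then read:

===== WGET.OOK (depth-limited) =====
wget -c -r -l2 -k -np -i d:\arac171\clip.tmp
====================================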

  If the retrieval looks like it is not working, or material that looks wrong
is starting to come in, Ctrl+C will stop the program in its tracks and
return you to Arachne.

Hope this gets you going.

Regards,
         Ron

Ron Clarke
AUSREG Consultancy http://homepages.valylink.net.au/~ausreg/index.html
Tadpole Tunes      http://tadpole.mytunebook.de/
-- This mail was written by user of The Arachne Browser - http://arachne.cz/
