Re: how do setting up the arachne for easy use wget???

2003-08-16 Thread cce . zizkov
On 14 Aug 03 at 14:02, [EMAIL PROTECTED] wrote:

> Hello All.

> please, somebody explain me step to step - how do i setting up the arachne
> for easy using wget for dos???

Hi Vladimir, here is how I usually call wget from Arachne:

1. How it works
***

You have to mark the URL of your target (right-click it, so that
Arachne puts it into clip.tmp), then call dl.dgi (probably with a
self-defined key).
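
For illustration: with the mime.cfg line from b) below, marking a URL
and pressing the key makes Arachne run something like the following
(the argument values are made-up examples; as the comments in aradl.bat
explain, $i is the current IP, $n the nameserver from arachne.cfg, and
$e - here assumed to be the Arachne directory - is what the script
changes back to at the end):

call aradl.bat 10.0.0.5 0.0.0.0 C:\ARACHNE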

2. Installation
***

a) add to arachne.cfg

ShiftF5 reload:file:dl.dgi

b) add to mime.cfg:

file/dl.dgi  |@call aradl.bat $i $n $e

c) copy aradl.bat to the Arachne directory:


-- BEGIN: aradl.bat --

@echo off
:: 1. check if Arachne is online with %1 (ip)
:: 2. get nameserver from arachne.cfg as %2
:: 3. create a wattcp.cfg file out of an existing
:: ip-up.bat in Arachne directory, nameserver is set
:: - either to the values of %dns1%/%dns2% in case they are
::   set by ip-up.bat (lsppp)
:: - or to the value of %2 in case that is not 0.0.0.0
:: 4. copy wattcp.cfg to the wget directory (customize)
:: 5. run wget
:: 6. return to Arachne drive and directory
::
set wget=i:\connect\wget
set wattcp.cfg=%wget%
if %1==0.0.0.0 goto error
copy nul wattcp.cfg > nul
if not exist IP-UP.bat goto error
call IP-UP.BAT
echo my_ip=%myip% >> wattcp.cfg
echo gateway=%remip% >> wattcp.cfg
echo netmask=%netmask% >> wattcp.cfg
if "%dns1%"=="" if not "%2"=="0.0.0.0" set dns1=%2
echo nameserver=%dns1% >> wattcp.cfg
if not "%dns2%"=="" echo nameserver=%dns2% >> wattcp.cfg
if not "%peermru%"=="" echo peermru=%peermru% >> wattcp.cfg
if "%peermru%"=="" echo peermru=1500 >> wattcp.cfg
echo sockdelay=10 >> wattcp.cfg
echo HELO=on >> wattcp.cfg
echo TZ=-1 >> wattcp.cfg
set myip=%MYIP%
if not exist wattcp.cfg goto error
copy wattcp.cfg %wget% > nul
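:: change to the wget drive and directory: a bare path ending in "\" is
:: a 4DOS-style automatic directory change; with plain COMMAND.COM you
:: may need a line with just the drive letter (e.g. I:) here instead
:: (the same applies to the %3\ line near the end of this script)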
%wget%\
cd %wget%
cls
echo Ready to download:
type l:\clip.tmp
pause
i:\connect\wget\wget.exe -t 5 -c -N -nH -nd -il:\clip.tmp
pause
goto end
:error
echo Sorry, no connection found!
:end
set wget=
set wattcp.cfg=
%3\
cd %3
-- END: aradl.bat --

Note that the script changes drive and directory twice.
Use a newer version of wget (e.g. 1.8.2).
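
For orientation, the wattcp.cfg that aradl.bat writes comes out roughly
like this (all addresses below are made-up example values; yours are
taken from ip-up.bat and arachne.cfg):

my_ip=10.0.0.5
gateway=10.0.0.1
netmask=255.255.255.0
nameserver=192.168.1.1
peermru=1500
sockdelay=10
HELO=on
TZ=-1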

> simple i need a little bit information about the arachne because i did
> russifycaton of her.

BTW thanks for russification. I wonder whether I can find a
new Arachne user among Russians in Prague.

Regards
Christof

___

 Christof Lange [EMAIL PROTECTED]
 Prokopova 4, 130 00 Praha 3, Czech Republic
 phone: (+420) 222 78 06 73 / 222 78 20 02
 http://www.volny.cz/cce.zizkov



Re: how do setting up the arachne for easy use wget???

2003-08-16 Thread Ron Clarke
Hi Vladimir,

On Thu, 14 Aug 2003 14:02:45 +0300, Vladimir [EMAIL PROTECTED] wrote:

> please, somebody explain me step to step - how do i setting up the arachne for
> easy using wget for dos???

   I can tell you what I have done.

1)  I created a directory off my Arachne directory e.g. d:\arac171\wget

2)  I placed the un-zipped DOS WGET files into that directory:
WGET.EXE and WATTCP.CFG are all that is required here, but you may find it
useful to also make yourself a handy reference.
At the DOS prompt, type:   wget -h > wgethelp.txt
WGETHELP.TXT will have all the commandline parameters and a short
explanation of each.

3)  I edited the WATTCP.CFG file - but only the nameserver numbers; the
rest is OK as it was. WGET can use up to 10 different nameservers,
trying each one in turn until successful (see the short sketch after
this list).

4)  I placed the file WGET.OOK into the d:\arac171\oops directory.

===== WGET.OOK =====
cd wget
wget -c -r -k -np -i d:\arac171\clip.tmp
d:
cd\arac171
====================

Edit this file to match the directory where Arachne
and WGET are to be found.

5)  In Options| Personal Settings  I edited the keyboard shortcuts, so
that (on my setup) Shift+F9 is set to:  d:\arac171\oops\wget.ook

  You can use any other Shift+ combination, or even put a button into any
suitable Arachne gui file, whatever feels most comfortable.
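
To make step 3) concrete, the nameserver part of WATTCP.CFG ends up as a
handful of lines of this form (the addresses below are placeholders; put
in the DNS servers of your own provider):

nameserver = 192.168.1.1
nameserver = 192.168.1.2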

How Does it Work?
=================

When I am online (and only then) I can Right-Click on a file or website link,
which places that URL into CLIP.TMP.   Then I press Shift+F9.
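
As a small illustration (the URL is only an example), CLIP.TMP then
contains nothing more than the one URL you clicked, which is what the
-i switch in WGET.OOK hands over to WGET:

http://www.example.com/somefile.zip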

I get a DOS screen, and I can see WGET connect to the URL and retrieve the file
(if it is a single file) or retrieve the web-page (graphics and all) if the
target is an HTM file.  Whatever is retrieved is saved into a directory that
matches the URL, which is created off the WGET directory. For a website, the
directory tree to hold graphics, CSS, multimedia files etc. is also created.

  When all is retrieved, I am returned to Arachne and the website where I
started from.  Once offline, I can go to the WGET directory, then
follow the directory tree to the website I have downloaded, and it
should all work.

  Note: The inclusion of   -c   in the command line tells WGET to resume a
partial download. This allows you to retrieve a large file by installments over
separate online sessions, but doesn't affect a retrieve starting from scratch.

  The parameter   -r   tells WGET to retrieve recursively, in other words to
also get the files linked from a webpage (graphics etc.). If you will
only ever want to download single files, then you can edit out  -r 
from WGET.OOK, along with  -np  and  -k . This also has the effect of saving
the (single) file directly into the WGET directory.
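
If you go that route, a stripped-down WGET.OOK for single files could
look something like this (same directories as in the example above;
just a minimal sketch of the idea):

===== WGET.OOK (single files) =====
cd wget
wget -c -i d:\arac171\clip.tmp
d:
cd\arac171
===================================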

  The parameter   -np   means no parent and avoids retrieving an entire
website all the way back to the domain name.

  The parameter   -k   tells WGET to convert internal links in the
downloaded files so that they will work from your hard disk.

  The parameter   -i   passes whatever is in d:\arac171\clip.tmp to WGET as
the URL to be retrieved.

What Doesn't Work
=================
  If the link is represented by a graphic, then only the graphic is retrieved,
so choose text links whenever possible.

  If the files attached to a website (graphics, multimedia) have Windows long
file names and the first 8 characters are identical, then WGET will save each
one on top of the previous one, so you end up with only the last such file saved.

  Be wary of extensive websites. The default depth for recursive retrieval is 5
levels, and this can fill up smaller hard disks very quickly. If space is
marginal, then include   -l2   (lower case L and 2) in the command line of
WGET.OOK, which will limit the recursive retrieval depth to two levels only.
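
With that change, the command line in WGET.OOK would read something
like:

wget -c -r -k -np -l2 -i d:\arac171\clip.tmp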

  If the retrieval looks like it is not working, or stuff is starting to
come in that looks wrong, Ctrl+C will stop the program in its tracks and
return you to Arachne.

Hope this gets you going.

Regards,
 Ron

Ron Clarke
AUSREG Consultancy http://homepages.valylink.net.au/~ausreg/index.html
Tadpole Tunes  http://tadpole.mytunebook.de/
-- This mail was written by user of The Arachne Browser - http://arachne.cz/