On Fri, Jun 27, 2003 at 03:04:34PM -0700, Ken Barber wrote:
> Hi,
> 
> I'm trying to clone a website (with the owner's permission) and 
> wget isn't working correctly (could it have something to do with 
> the fact that it's hosted on AOL? hmmm).
> 
> I seem to recall the existence of another tool (other than wget) 
> to copy web sites but I cannot for the life of me remember its 
> name.
> 
> Any suggestions?
> 
> Thanx,
> 
> Ken

Look at these:

puf
Description: Parallel URL fetcher
 puf is a download tool for UNIX-like systems. You may use it to
 download single files or to mirror entire servers. It is similar to GNU
 wget (and has a partly compatible command line), but has the ability to
 do many downloads in parallel. This is very useful if you have a
 high-bandwidth internet connection.

snarf
Description: A command-line URL grabber
 Snarf is a utility to retrieve files via the http and
 ftp protocols. It supports http redirect, http and ftp resume, http
 and ftp authentication, and other neat things.  Its functionality
 is similar to that of wget, but with a much smaller binary.
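Before switching tools, it may also be worth double-checking the wget invocation
itself. A typical site-mirroring command looks something like this (the URL is a
placeholder; the --user-agent override is just a guess at why an AOL-hosted site
might refuse wget, since some servers reject unfamiliar user-agent strings):

```shell
# Mirror a site: recurse, fetch page requisites (images/CSS),
# and rewrite links so the local copy browses offline.
wget --mirror \
     --page-requisites \
     --convert-links \
     --wait=1 \
     --user-agent="Mozilla/4.0 (compatible)" \
     http://members.example.com/somesite/
```

Since puf advertises a partly wget-compatible command line, a similar invocation
may work there too, with downloads running in parallel.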

Cory

-- 
Cory Petkovsek                                       Adapting Information
Adaptable IT Consulting                                Technology to your   
(541) 914-8417                                                   business
[EMAIL PROTECTED]                                  www.AdaptableIT.com
_______________________________________________
EuG-LUG mailing list
[EMAIL PROTECTED]
http://mailman.efn.org/cgi-bin/listinfo/eug-lug