Can't find an exclude option in wget that matches only files :( :( :(

I was going to use something like:

grep 'ftp://' wget.log | cut -f4 -d'/' | wget -m -nH -o wget.log.?? \
    --exclude-??? - ftp://<user>:<pass>@<host>/
cat wget.log.?? >wget.log
rm wget.log.??
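
Roughly what I had in mind, if wget's name-based reject list (-R/--reject)
turns out to be close enough to the missing "exclude files" option - I'm not
sure it is, since -R matches file name suffixes/patterns rather than full
paths, so treat this as an untested sketch:

  # Pull the names of files already fetched out of the old log (assumes the
  # file name is the last path component of each logged ftp:// URL), then
  # hand them to wget as a comma-separated reject list.
  DONE=$(grep 'ftp://' wget.log | sed 's|.*/||' | sort -u | paste -sd, -)
  wget -m -nH -a wget.log -R "$DONE" ftp://<user>:<pass>@<host>/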

Damn... Darn... Et al...

Regards,
 
Raphael Kraus
Software Developer
[EMAIL PROTECTED]
02 8306 0007 Direct Line
02 8306 0077 Sales | 02 8306 0099 Fax
02 8306 0088 Support
02 8306 0055 Administration
1300 13 WILD (9453) National | 1300 88 WILD (9453) Fax

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On
Behalf Of Raphael Kraus
Sent: Friday, 8 September 2006 5:06 PM
To: Craig Dibble
Cc: [email protected]
Subject: RE: [SLUG] wget remembering previous downloads

Thanks Craig...

It sounds like the clue I've been after... :)

Don't worry - it'll all be scripted. :) 


Regards,
 
Raphael Kraus
Software Developer
[EMAIL PROTECTED]
02 8306 0007 Direct Line
02 8306 0077 Sales | 02 8306 0099 Fax
02 8306 0088 Support
02 8306 0055 Administration
1300 13 WILD (9453) National | 1300 88 WILD (9453) Fax

-----Original Message-----
From: Craig Dibble [mailto:[EMAIL PROTECTED]
Sent: Friday, 8 September 2006 10:19 AM
To: Raphael Kraus
Cc: [email protected]
Subject: Re: [SLUG] wget remembering previous downloads

Raphael Kraus wrote:
> G'day all...
>  
> I'm doing some man'ning to no avail here...
>  
> Is there a way to have wget (downloading via ftp) remember what it has
> successfully downloaded, and not download the same file again - even if
> the file is deleted from disk?
>  
> If not, has anyone else had to face this problem before?

wget -nc will stop it downloading a file that already exists in the same
directory, but to do what you're suggesting - skip files you've since moved
or deleted - you'd probably need to do something clever like writing the
output to a logfile (-o <file> on the first run, then -a <file> to append on
later runs), then feeding that logfile back in via an 'exclude' of some sort
to ignore the files it lists. You'd probably need to wrap it all in a script
to get it to work correctly, though.
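
Something along these lines, perhaps - an untested sketch that assumes
completed downloads show up in the log as "'file' saved [size]" lines (older
wgets print `file' with a backtick, so the sed pattern may need adjusting),
and that a name-based -R/--reject list is close enough to the 'exclude'
you're after:

  #!/bin/sh
  # Keep one cumulative log, pull the names of files already fetched out of
  # it, and feed them back to wget as a reject list so they aren't
  # downloaded again even if they've been deleted from disk.
  LOG=wget.log
  URL='ftp://<user>:<pass>@<host>/'
  touch "$LOG"

  # File names recorded as saved by previous runs (log format assumption).
  DONE=$(sed -n "s/.*'\(.*\)' saved.*/\1/p" "$LOG" | sort -u | paste -sd, -)

  if [ -n "$DONE" ]; then
      wget -m -nH -nc -a "$LOG" -R "$DONE" "$URL"
  else
      wget -m -nH -nc -a "$LOG" "$URL"
  fi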

HTH,
Craig
--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html