Sent: Sunday, July 13, 2014 4:36 AM
To: wget@sunsite.dk
Subject: Annoying behaviour with --input-file
If wget is used with the --input-file option, it gets a directory listing for
each file specified in the input file (if ftp protocol) before downloading each
file, which is quite annoying if there are a few thousand small files in the
filelist, and every directory listing is way longer than any file.
"Adam Klobukowski" <[EMAIL PROTECTED]> writes:
>> "Adam Klobukowski" <[EMAIL PROTECTED]> writes:
>>
>> > If wget is used with --input-file option, it gets directory
>> > listing for each file specified in input file (if ftp protocol)
>> > before downloading each file,
>>
>> This is not specific to --input-file, it happens when --timestamping
>> is specified.
Fred Holmes <[EMAIL PROTECTED]> writes:
> And I tried -nc on downloading only new files from ftp.eps.gov. While
> it worked, the comparison is very slow, a significant fraction of a
> second to compare each file. With over 700 files to compare and
> refuse, it takes a long time to perform the comparison.
At 06:30 PM 11/25/2003, Hrvoje Niksic wrote:
Are you using --timestamping (-N)? If so, can you do without it, or
replace it with --no-clobber?
But then you will only download new files, not newer files? But I want the
newer files (updated virus definition files from ftp.f-prot.com).
And I tried -nc on downloading only new files from ftp.eps.gov; while it
worked, the comparison is very slow.
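For reference, the two modes under discussion can be sketched like this. The
file list and URLs below are made-up placeholders, not taken from the thread,
and the wget invocations are commented out since they need network access:

```shell
# Hypothetical input file; these URLs are placeholders:
cat > filelist.txt <<'EOF'
ftp://ftp.f-prot.com/pub/defs1.zip
ftp://ftp.f-prot.com/pub/defs2.zip
EOF

# -N (--timestamping) compares remote and local timestamps, so wget
# requests a directory listing for every URL before deciding whether
# to download it:
# wget -N --input-file=filelist.txt

# -nc (--no-clobber) refuses files that already exist locally, but as
# noted in the thread the per-file check is still slow, and updated
# remote files are never re-downloaded:
# wget -nc --input-file=filelist.txt
```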
"Adam Klobukowski" <[EMAIL PROTECTED]> writes:
> If wget is used with --input-file option, it gets directory listing
> for each file specified in input file (if ftp protocol) before
> downloading each file,
This is not specific to --input-file, it happens when --timestamping
is specified.
Are you using --timestamping (-N)? If so, can you do without it, or
replace it with --no-clobber?
I pointed this out about a year ago. As I recall, the response I got back
then was that fixing it is "too hard." I'm looking for any way to download
new/newer files on a specific list (wild cards won't make the proper
selection) where wget makes one connection and keeps it for the entire
operation.
If wget is used with --input-file option, it gets directory listing
for each file specified in input file (if ftp protocol) before
downloading each file, which is quite annoying if there are a few
thousand small files in the filelist, and every directory listing
is way longer than any file.
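One possible workaround, not proposed in the thread, is to let wget fetch each
directory listing once via recursive FTP retrieval and filter the results with
an accept list (-A), instead of handing it one URL per file. The filenames and
server path below are hypothetical, and the network call is commented out:

```shell
# Build a comma-separated accept list from a plain list of filenames
# (names.txt and its contents are placeholders):
printf '%s\n' defs1.zip defs2.zip > names.txt
ACCEPT=$(paste -s -d, names.txt)
echo "$ACCEPT"   # defs1.zip,defs2.zip

# Recursive retrieval (-r -l1) fetches ONE listing per directory and
# then filters it with -A, rather than issuing one LIST per file;
# -nd flattens the output, --no-parent stays inside the directory:
# wget -r -l1 -nd -N --no-parent -A "$ACCEPT" ftp://ftp.f-prot.com/pub/
```

Whether the accept-list filtering is precise enough depends on the filenames;
the thread notes that simple wildcards did not make the proper selection.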