Quoting Chunks ([EMAIL PROTECTED]):
> I did RTFM, and the links to any mailing list archives I could find
> were broken. Please accept my apologies in advance if this is
> something covered elsewhere. Perhaps ignoring permissions will take
> care of it?
Could you tell us which links were actually broken?
Quoting Jamie Zawinski ([EMAIL PROTECTED]):
> However, that said, I still think wget should do what Netscape does,
> because that's what everyone expects. The concept of a "default
> directory" in a URL is silly.
The correct approach would be to try "CWD url/dir/path/" (the correct
meaning) and [...]
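The CWD discussion can be illustrated with a short sketch. This is not wget's actual code, only a minimal Python illustration of how RFC 1738 maps the path part of an ftp: URL to a sequence of CWD commands, each issued relative to the login directory; an empty first segment (a leading "//" in the URL) is taken to mean the server's root:

```python
def cwd_commands(url_path):
    """Return the CWD arguments implied by the path part of an ftp: URL.

    Per RFC 1738, each "/"-separated segment becomes one CWD command,
    issued relative to the login directory.  An empty first segment
    (i.e. a leading "//" after the host in the URL) is read here as
    CWD "/", the server's root directory.
    """
    segments = url_path.split("/")[:-1]   # last segment is the file name
    return ["CWD " + (seg if seg else "/") for seg in segments]

# Single slash after the host: path is relative to the login directory.
print(cwd_commands("pub/redhat/updates/foo.rpm"))
# Double slash after the host: first segment is empty, so start at root.
print(cwd_commands("/pub/redhat/updates/foo.rpm"))
```

The helper name `cwd_commands` is made up for this example; it only shows why the single-slash and double-slash URLs in this thread walk different directory paths.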
Hanno Foest wrote:
>
>> ftp://ftp.redhat.com/pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
>> ftp://ftp.redhat.com//pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
...
> I don't think so. The double slash in front of the path part of the URL
> starts the path in the ftp server's root directory, not the login directory.
Quoting Hanno Foest ([EMAIL PROTECTED]):
> On Mon, Feb 26, 2001 at 12:46:51AM -0800, Jamie Zawinski wrote:
>
> > Netscape can retrieve this URL:
> >
> > ftp://ftp.redhat.com/pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
> >
> > wget cannot. wget wants it to be:
> >
> > ftp://ftp.redhat.com//pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
On Mon, Feb 26, 2001 at 12:46:51AM -0800, Jamie Zawinski wrote:
> Netscape can retrieve this URL:
>
> ftp://ftp.redhat.com/pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
>
> wget cannot. wget wants it to be:
>
> ftp://ftp.redhat.com//pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
I'm mirroring a very large tree locally. As the tree is larger
than the local filesystem, I periodically stop wget, save what
I've downloaded on CD-ROM, truncate the saved files to 0 and
then start wget -N -r again to get more files.
Unfortunately, wget checks not only the mtime but also the size, so
the files I have truncated to 0 get downloaded all over again.
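A minimal sketch (not wget's source) of the time-stamping rule being described: with -N, a file is re-fetched if the remote copy is newer *or* if the local and remote sizes differ, so truncating a saved file to 0 bytes defeats the check even when the timestamps still match:

```python
def should_download(local_mtime, local_size, remote_mtime, remote_size):
    """Illustrative -N decision rule: mtimes are compared first, and
    a size mismatch also forces a re-download."""
    if remote_mtime > local_mtime:
        return True    # remote copy is newer
    if local_size != remote_size:
        return True    # sizes differ, e.g. the local file was truncated
    return False

# The poster's case: identical mtimes, but the local copy was truncated.
print(should_download(1000, 0, 1000, 54321))   # size mismatch forces re-fetch
```

This is why saving files to CD-ROM and truncating the originals does not play well with -N: the size check fires on every truncated file.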
Netscape can retrieve this URL:
ftp://ftp.redhat.com/pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
wget cannot. wget wants it to be:
ftp://ftp.redhat.com//pub/redhat/updates/7.0/i386/apache-devel-1.3.14-3.i386.rpm
I believe the Netscape behavior is right and the wget behavior is wrong.
If I specify a whole bunch of ftp: URLs to wget that are on the same
host, it opens a new connection to the server for each one. It should
reuse the same connection if they're on the same host.
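The requested behavior can be sketched with Python's ftplib: group the URLs by host and fetch every file on one host over a single control connection, instead of reconnecting per URL. This is an illustration under that assumption, not wget code, and the function names are made up:

```python
from collections import defaultdict
from ftplib import FTP
from urllib.parse import urlparse

def group_by_host(urls):
    """Bucket ftp: URLs by hostname so each host needs one connection."""
    by_host = defaultdict(list)
    for u in urls:
        p = urlparse(u)
        by_host[p.hostname].append(p.path)
    return dict(by_host)

def fetch_all(urls):
    """Fetch all files, opening one FTP control connection per host."""
    for host, paths in group_by_host(urls).items():
        ftp = FTP(host)            # single connection for this host
        ftp.login()                # anonymous login
        for path in paths:
            local = path.rsplit("/", 1)[-1]
            with open(local, "wb") as f:
                ftp.retrbinary("RETR " + path, f.write)
        ftp.quit()
```

With this grouping, a batch of URLs on ftp.redhat.com would cost one connection rather than one per file; the per-file work is reduced to CWD/RETR on the already-open session.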
--
Jamie Zawinski
[EMAIL PROTECTED] http://www.jwz.org/