Petri Koistinen wrote:
> Hi,
> It would be nice if wget would also support file://.

Feel free to file an issue for this (I'll mark it "Needs Discussion" and
set at low priority). I'd thought there was already an issue for this,
but can't find it (either open or closed). I know this has come up
before, at least.

I think I'd need some convincing on this, as well as a clear definition
of what the scope for such a feature ought to be. Unlike curl, which
"groks urls", Wget "W(eb)-gets", and file:// can't really be argued to
be part of the web.

That in and of itself isn't really a reason not to support it, but my
real misgivings have to do with the existence of various excellent tools
that already do local-file transfers, and likely do it _much_ better
than Wget could hope to. Rsync springs readily to mind.

Even the system "cp" command is likely to handle things much better than
Wget. In particular, OS-specific extended file attributes, extended
permissions, and the like are among the things that existing system
tools probably handle quite well, and that Wget is unlikely to. I don't
really want Wget to be in the business of duplicating the system "cp"
command, but I might conceivably not mind "file://" support if it means
simple _content_ transfer, and not actual file duplication.

Also in need of addressing is what "recursion" should mean for file://.
Between ftp:// and http://, "recursion" currently means different
things. In FTP, it means "traverse the file hierarchy recursively",
whereas in HTTP it means "traverse links recursively". I'm guessing
file:// should work like FTP (i.e., recurse when the path is a
directory, ignore HTML-ness), but anyway this is something that'd need
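
To make the distinction concrete, here's a minimal sketch (not actual
Wget code; the function name and behavior are my own assumptions) of
what FTP-style recursion for file:// could mean: recurse when the path
is a directory, and transfer file _content_ only, without attempting
cp-style duplication of attributes or permissions:

```python
import os
import shutil
import urllib.parse

def wget_file(url, dest_root="."):
    """Hypothetical sketch of FTP-style recursion for file:// URLs.

    Copies *content* only -- no extended attributes, ownership, or
    permission bits -- recursing when the path names a directory.
    This is an illustration of the proposed semantics, not anything
    Wget actually implements.
    """
    path = urllib.parse.urlparse(url).path
    if os.path.isdir(path):
        # "Recurse when the path is a directory, ignore HTML-ness":
        # descend into the tree, mirroring it under dest_root.
        subdir = os.path.join(dest_root, os.path.basename(path.rstrip("/")))
        for name in os.listdir(path):
            wget_file("file://" + os.path.join(path, name), subdir)
    else:
        os.makedirs(dest_root, exist_ok=True)
        dest = os.path.join(dest_root, os.path.basename(path))
        with open(path, "rb") as src, open(dest, "wb") as dst:
            # Content transfer only, unlike cp -a style duplication.
            shutil.copyfileobj(src, dst)
```

Nothing here handles symlinks, sparse files, or xattrs, which is
exactly the sort of thing rsync and cp already do well.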

- --
Micah J. Cowan
Programmer, musician, typesetting enthusiast, gamer.
GNU Maintainer: wget, screen, teseq