-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

David wrote:
> 
> Hi Micah,
> 
> You're right - this was raised before and in fact it was a feature
> Mauro Tortonesi intended to implement for the 1.12 release, but it
> seems to have been forgotten somewhere along the line. I wrote to the
> list in 2006 describing what I consider a compelling reason to support
> file://. Here is what I wrote then:
> 
> At 03:45 PM 26/06/2006, David wrote:
> In replies to the post requesting support of the "file://" scheme,
> requests were made for someone to provide a compelling reason to want to
> do this. Perhaps the following is such a reason.
> I have a CD with HTML content (it is a CD of abstracts from a scientific
> conference), however for space reasons not all the content was included
> on the CD - there remain links to figures and diagrams on a remote web
> site. I'd like to create an archive of the complete content locally by
> having wget retrieve everything and convert the links to point to the
> retrieved material. Thus the wget functionality when retrieving the
> local files should work the same as if the files were retrieved from a
> web server (i.e. the input local file needs to be processed, both local
> and remote content retrieved, and the copies made of the local and
> remote files all need to be adjusted to now refer to the local copy
> rather than the remote content). A simple shell script that runs cp or
> rsync on local files without any further processing would not achieve
> this aim.

Fair enough. This example at least makes sense to me. I suppose it can't
hurt to provide this, so long as we document clearly that it is not a
replacement for cp or rsync, and never will be (it won't handle file
attributes or other special file properties).
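Concretely, the use case above might look something like this once file://
is supported (the mount point is only an example; -r, -k and -p are wget's
existing recursion, link-conversion and page-requisites flags):

```shell
# Hypothetical invocation, assuming a future wget that accepts file:// URLs.
# -r   recurse through the local HTML just as it would over HTTP
# -k   convert links in the saved copies to point at the local files
# -p   also fetch page requisites, including the remote figures and diagrams
wget -r -k -p file:///mnt/cdrom/index.html
```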

However, support for file:// will introduce security issues, so care is needed.

For instance, file:// links should never be honored when they come from
the web. Even on the local machine, it could be problematic to honor them
in files writable by other users, since those users could craft links that
trick a more privileged user into copying protected files with laxer
permissions. Perhaps files that are readable only by root should always be
skipped, or wget should require a "--force" sort of option whenever the
current mode could result in more permissive settings on the downloaded file.

Perhaps it would be wise to make this behavior configurable. It might also
be prudent to offer an option that disallows file:// entirely when wget
runs as root.
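The safeguard suggested above could be sketched along these lines (the
function and flag names are mine, not wget's; this illustrates the policy,
not a proposed implementation):

```shell
# check_and_copy FILE [--force]
# Sketch of the proposed safeguard: refuse to copy a file:// source that is
# not world-readable, since the copy would typically be created under the
# invoking user's umask and could end up more permissive than the original.
check_and_copy() {
    src=$1
    force=${2:-}

    # Octal permission bits of the source, e.g. "600" or "644" (GNU stat).
    mode=$(stat -c '%a' "$src") || return 1

    # Test the "others read" bit (04) of the mode.
    if [ $(( 0$mode & 4 )) -eq 0 ] && [ "$force" != "--force" ]; then
        echo "skipping $src: not world-readable (pass --force to copy anyway)" >&2
        return 2
    fi
    cp -- "$src" .
}
```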

https://savannah.gnu.org/bugs/?24347

If any of you can think of additional security issues that will need
consideration, please add them in comments to the report.

- --
Micah J. Cowan
Programmer, musician, typesetting enthusiast, gamer.
GNU Maintainer: wget, screen, teseq
http://micah.cowan.name/
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.7 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

iD8DBQFI19aE7M8hyUobTrERAt49AJ4irLGMd6OVRWeooKPqZxmX0+K2agCfaq2d
Mx9IgSo5oUDQgBPD01mcGcY=
=sdAZ
-----END PGP SIGNATURE-----
