On 02/06/06, A. Pagaltzis <[EMAIL PROTECTED]> wrote:
> * Michael Mathews <[EMAIL PROTECTED]> [2006-06-02 12:10]:
> > I would also point to PHP's fopen() (and friends), which take
> > "filenames" in the form of "C:\\blah" or "http://foo.com/blah"
> > and behave the same.
>
> Yes, which means that any un- (or badly) sanitised user input has
> the potential to make a program fetch data from a remote server,
> which is particularly dumb when you consider that `require`
> offers the same misfeature. Makes exploits so much easier.
>
> No, HTTP is not like opening local files and should not be
> treated as such. The failure modes also differ entirely.
>
> The right place to abstract this, if you want to, is at the
> networking layer, which could treat local files as a special form
> of remote ones, which can be done simply by accepting `file://`
> URIs in addition to `http://` and `ftp://`. This way around works
> much better for a range of reasons. (I'm too lazy to go into them
> right now but will if you ask.)
>
> That doesn't belong in the language core, though.
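[A sketch of the failure mode described above, in Python rather than PHP purely for illustration: if user input reaches a URL-aware opener unchecked, a "filename" like http://foo.com/blah silently becomes a remote fetch. One simple guard is to inspect the scheme before opening anything. The helper name `is_remote` is my own, not from the thread.]

```python
from urllib.parse import urlparse

def is_remote(name):
    """True if `name` looks like a non-local URI rather than a file path."""
    scheme = urlparse(name).scheme
    # Windows paths such as C:\blah parse with a one-letter "scheme";
    # treat those as drive letters, not URI schemes.
    return len(scheme) > 1 and scheme != "file"

print(is_remote("http://foo.com/blah"))   # True: would hit the network
print(is_remote(r"C:\blah"))              # False: plain Windows path
print(is_remote("/etc/passwd"))           # False: plain POSIX path
```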
That last point is a very good argument, but oddly it has nothing to do
with anything I said (or that I've read in this thread, for that
matter). I believe I said "in Perl6 modules", not "in the language
core" as you imply, and I read Josh's post to be about a so-called
"core" *distribution* with network *modules* included, but again, not
about the language core.
The example I gave from PHP was just one example of how "they" do it.
We can abstract it in whatever layer we want, but my point was that
accessing files in an increasingly networked world requires
network-smart modules, so some of those should be on the short list
for early porting. At least, that was what I was trying to say.
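[A minimal sketch of what a "network-smart" module looks like in practice, using Python's urllib as a stand-in since the thread names no specific module: the networking layer treats local files as a special case of remote ones by accepting `file://` URIs alongside `http://` and `ftp://`, exactly the direction Aristotle suggests.]

```python
import os
import pathlib
import tempfile
from urllib.request import urlopen

# Create a throwaway local file to stand in for a path like "C:\blah".
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("local data, fetched through the URI layer")
    path = f.name

uri = pathlib.Path(path).as_uri()   # e.g. file:///tmp/tmpXXXX.txt
with urlopen(uri) as resp:          # the same call that fetches http://
    data = resp.read().decode()

print(data)
os.unlink(path)                     # clean up the throwaway file
```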
As for writing bad code that is therefore easy to exploit, I wasn't
suggesting that either. I personally advise that taint mode be turned
on for ALL web code, but that's another thread, I believe, and this
one has already started to unravel. :-)