begin quoting John H. Robinson, IV as of Sun, Mar 05, 2006 at 09:01:54AM -0800:
[snip]
> Artificial limitations are eviler.

Disallowing spaces is a special case of a user-imposed limitation; the
user should _always_ be allowed to impose whatever limitations they
choose.  It's their machine, after all.

>                                    I will stick with being able to
> handle any legal character, this includes backspaces, newlines, and
> horizontal tabs in addition to spaces.

And nulls. Don't forget nulls.
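
NUL is the one byte a POSIX filename cannot contain, which is exactly
what makes it the only safe separator for arbitrary names. A minimal
sketch (the directory is a placeholder; -print0 and -0 are GNU/BSD
extensions):

    # Stream names NUL-delimited so embedded spaces, tabs, and
    # newlines survive intact all the way to the consumer.
    find /some/dir -type f -print0 | xargs -0 ls -ld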
 
> What is truly evil is disk-editing a solidus into a filename.

Nah. Ignoring the user's wishes, that's the problem. "This is what I
think you ought to want to do, therefore, I'm going to enforce that."

> The only reason I can think that a space in a filename would be
> considered evil is because the space is a token separator on the shell
> command line.

It's also the token separator when reading prose; the eye is trained to
tokenize on spaces.
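
Concretely, an unquoted expansion gets split on that same whitespace,
so one name with a space becomes two arguments. A quick demonstration
(the filename is invented; the error text is roughly what GNU ls
prints):

    $ touch 'two words'
    $ a='two words'
    $ ls $a        # unquoted: the shell passes "two" and "words"
    ls: cannot access 'two': No such file or directory
    ls: cannot access 'words': No such file or directory
    $ ls "$a"      # quoted: one argument, the actual filename
    two words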

[snip - bourne-shell brokenness]
>                                          Since no one put spaces in
> filenames, people got used to thinking that $a was sufficient for
> filename use.

And it is, given that reasonable constraint (no whitespace in names).
Remember, the computer is there for the human; if there's tedious
manual work involved, it should be performed by the computer, not the
human.
 
> This is what I consider a poor initial design. However, the mind-set
> has stuck, and some modern shells still maintain this broken behaviour
> even though they have proper array variables now.
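
For reference, the array form those shells provide looks like this
(bash syntax; the glob and printf are just for illustration):

    # Collect names into an array; the quoted [@] expansion hands
    # each element to the command as exactly one word.
    files=( ./* )
    for f in "${files[@]}"; do
        printf '%s\n' "$f"
    done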

There have been systems that did not use spaces for tokenization; they
are generally (in my opinion) more difficult to use. This is reflected
in the MS-DOS use of / as an option indicator -- and all the fun that
results from that. And fixed-form Fortran had the worst of both worlds:
mandatory leading whitespace and optional whitespace everywhere else.

-- 
_ |\_
 \|


