[EMAIL PROTECTED] (Dan Sugalski)  wrote on 15.08.00 in 
<[EMAIL PROTECTED]>:

> At 06:04 PM 8/15/00 -0400, John Porter wrote:
> >Dan Sugalski wrote:
> > > >Generality good.
> > >
> > > For many things, yes. For computers, say. For people, no. Generality
> > > bad. Specificity and specialization good. People aren't generalists.
> > > They're a collection of specialists. The distinction is important.
> >
> >I'm sorry if I don't find this argument convincing.
> >This argument suggests that *every* type carry a distinguishing
> >prefix symbol -- including ones to distinguish between numbers and
> >strings, say.
>
> Numbers and strings really aren't different things, at least not as far as
> people are concerned.

Bovine excrement. Numbers and strings are completely different things to  
people.

Hashes and arrays, OTOH, really aren't different for people. The concept  
of an index needing to be a nonnegative number is a computer concept.
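To make that concrete, here is a minimal Perl 5 sketch (the variable name  
is invented for illustration); to a reader, the only difference is the  
bracket style and whether the index is a position or a name:

  my @weekday = ('Mon', 'Tue', 'Wed');   # array: indexed by position
  my %weekday = (first => 'Mon');        # hash: indexed by name
  print $weekday[0];                     # prints "Mon"
  print $weekday{first};                 # also prints "Mon"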

>They are for machines, but computer languages
> ultimately aren't for machines, they're for people.

I agree that computer languages are for people - in fact, that's the sole  
reason they were invented. Otherwise we'd still program in machine  
language.

However, I do think your ideas about what does and does not come naturally  
to people are incredibly warped.

> I'm not presuming that, though there are plenty of languages already that
> have no symbols. Perl's not one of them, though.

I presume you mean $@% when you say symbols (and not, for example, the  
stuff that goes into a symbol table, because I certainly don't know any  
language that doesn't have those).

It's an open question whether these are a good idea. I'm not convinced.  
Personally, I see them as one of the ugly warts in Perl.
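For anyone who doesn't have Perl in their fingers, a minimal sketch of the  
sigils in question (names made up for illustration); the punctuation mark  
tells you at a glance what kind of thing you are holding:

  my $count = 3;                        # $ marks a scalar (one value)
  my @items = ('alpha', 'beta');        # @ marks an array (an ordered list)
  my %price = (alpha => 1, beta => 2);  # % marks a hash (a keyed lookup)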

> > > Even assuming highlander types, the punctuation carries a rather
> > > significant amount of contextual information very compactly.

s/significant/in&/ (i.e. "insignificant"), IMNSHO, even ignoring the  
"even" part.

> It's going to always be more difficult. You need to *think* to turn a word
> into a symbol. = is already a symbol. Less effort's needed.

Maybe *you* have to think to turn a word into a symbol. Most people seem  
to grow up with that concept.

As for *recognizing* these symbols, ever heard of redundancy? It's what  
makes recognition easier.

Words, by being longer, are easier to recognize than non-alphabetic  
symbols.

Now, as always, you have to strike a balance. We don't want COBOL. But we  
also don't want APL.

> > > ...exploiting instinct and
> > > inherent capabilities give you faster response times, and quicker
> > > comprehension.
> >
> >Sure.  But "instinct and inherent capabilities" do not apply here.
>
> Yes, they do. People write source. People read source. People are the most
> important piece of the system. The computer can be made to cope with the
> syntax, whatever the syntax is. People can't be made to cope nearly as
> easily, nor to nearly the same degree.

I completely agree with this point. Which is exactly why I disagree with  
most of your other points.


Regards, Kai
