Re: Computers are systems not languages

2011-02-24 Thread alex
On 24 February 2011 01:15, Richard O'Keefe  wrote:
> This is the same Naur who wrote
>
>        I cannot help expressing a feeling of awkwardness at the use of
>        the word language in the context "programming language."
>        I definitely feel that if taken literally this habit of expression
>        is misleading.
>        As a first step to subdue our feeling of guilt at this misuse of
>        the term, perhaps we can remind ourselves that logicians and
>        mathematicians used the word for specialized notations long before
>        we did, already in the 1930's, and perhaps even before.
>        ...
>        Several of the social aspects of mathematics and natural languages
>        show a meaningful analogy with similar aspects of programming 
> languages.
>        It therefore makes sense to extrapolate the analogy to further such
>        aspects.

Thanks.  That could almost be a missing conclusion to the end of that essay.

> Someone who uses language to tell us something is by his actions
> *denying* that meaning is a personal matter, for if it were so, he
> might as well expect the same outcome from grunting and scratching
> his chest.

No, his point isn't that extreme.  Just that meaning isn't in the
words, or in the world, but in our heads.  Cognitive linguists such as
George Lakoff, Lawrence Barsalou and Peter Gärdenfors would agree.
Language then isn't a direct transferral of meaning, but far messier
than that, as whatever you say will be interpreted within a rather
different frame of reference.  From the preface of Computing: A human
activity, explaining the motivation for the anthology:

  "... in order to grasp the ideas, the opinions, the points of view,
etc., of another person it is not sufficient, or even necessary, to
understand or accept a definite set of themes or notions.  Rather,
what is required is that we are exposed to a full and varied
expression of that person, related to many different issues of
concern. By being thus exposed we may hope to recognize how the
patterns of the other person's ideas, opinions, points of view, partly
match, partly extend those we have ourselves already.  This manner of
gaining insight into and learning from another person may well be a
slow development, requiring repeated readings of the person's writing
and months or years of digestion."

Perhaps you prove him right by adding that nuance to his point of view above :)

alex

-- 
http://yaxu.org/





Re: Redefining the word "language"

2011-02-24 Thread Adam Smith
Kari and Richard's attention to symbols, definition, and meaning is
highly appropriate, but there's another angle at play here which I
think is more central to the language-ness of programming languages.
I'd like to share an analogy that's stuck with me for several years
when thinking about the distinction between "programming" and
"natural" as modifiers for the concept of language (particularly one
that tries to side-step discussion of symbols).

Consider the representation of some physical value such as the
temperature of a room. We can make a "programming" representation of
this value with some configuration of a fixed number of bits (an int
in a machine word). Likewise, we can make a "natural" representation
of the value with, um, another physical value such as a voltage or
height of a mercury column in a thermometer. The programming
representation has a very chunky, discrete range of expression which
delivers a set number of bits of information. Meanwhile, the natural
representation smoothly ranges over some continuous domain, conveying
a fuzzy/undefined amount of information bounded by a complex
interaction of external noise sources and sampling error.
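A minimal Python sketch of that contrast (the temperature range and bit width here are made-up assumptions, not from the thread): quantizing a continuous reading into a fixed number of bits produces a chunky code, and the round trip loses information.

```python
def quantize(temp_c, lo=-40.0, hi=60.0, bits=8):
    """Map a continuous temperature onto one of 2**bits discrete codes."""
    span = hi - lo
    code = int((temp_c - lo) / span * (2**bits - 1))
    return max(0, min(2**bits - 1, code))  # clamp to the representable range

def dequantize(code, lo=-40.0, hi=60.0, bits=8):
    """Recover the (chunky) temperature that a code stands for."""
    return lo + code / (2**bits - 1) * (hi - lo)

reading = 21.37  # the mercury column can sit anywhere in a continuous range
code = quantize(reading)
print(code, round(dequantize(code), 2))  # the round trip is lossy
```

The "natural" representation is whatever `reading` was before `quantize` touched it; the "programming" representation is `code`, one of exactly 256 possibilities.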

In programming languages, we've got these very discrete sequences of,
say, ascii characters, some subset of which are blessed by a
particular context-free grammar to be valid. Program code isn't
bounded in the same way as a 32-bit unsigned int, but it has a similar
discrete feeling. Meanwhile, natural language encompasses a seemingly
infinite domain of expression, but the deeper we dig in extracting
information from an utterance, the more we need to make assumptions
about where the message came from and how well we heard it.

Programming representations invite us to interpret
(parse/compile/execute/etc.) them once and be confident that we got
everything important on the first try. Natural representations invite
us to repeatedly ask refining questions, allowing subsequent samples
to change our mind about inferences from the first question. An ADC
can convert a noisy (at some level) electrical signal into a discrete
digital symbol by repeatedly checking whether the signal is currently
above or below certain reference voltages. Indeed any amount of
digital information can be packed into some real value
(http://en.wikipedia.org/wiki/Arithmetic_coding), but for practical
purposes you will stop after some number of iterations. Natural
language can be subject to a similar process where seemingly limitless
information can be sucked out of a fixed natural language input via
close reading and the asking of a lot of tiny questions
(http://en.wikipedia.org/wiki/Deconstruction seems to experimentally
probe for the practical limits here).
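The ADC process described above can be sketched as successive approximation: each comparison against a reference voltage yields one more bit, and you stop after a chosen number of iterations. The signal value and bit depth below are illustrative assumptions.

```python
def successive_approximation(signal, vmax=1.0, bits=8):
    """Extract `bits` bits from a signal by repeatedly halving the interval."""
    lo, hi = 0.0, vmax
    code = 0
    for _ in range(bits):
        mid = (lo + hi) / 2
        bit = 1 if signal >= mid else 0  # the comparator's verdict
        code = (code << 1) | bit
        if bit:
            lo = mid   # signal is in the upper half of the interval
        else:
            hi = mid   # signal is in the lower half of the interval
    return code

print(successive_approximation(0.7))  # 0.7 of full scale as an 8-bit code
```

Running more iterations packs more digital information into the same real value, which is the connection to arithmetic coding made above; in practice noise sets the useful limit.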

This analogy between languages and simple physical values gives us a
way to talk about particular non-natural-nesses of programming
languages without references to symbols and semantics. (And if code is
data and you can store any data in a big fat bitstring, then it
shouldn't be saying anything too controversial.) But I'm somewhat of
an advocate for programming languages as languages, so now I want to
show how practical programming representations regularly find
themselves creeping in the natural direction.

Consider the generation of html documentation from java code using
javadoc. The official java compiler immediately throws away a lot of
information when it reads your source (comments, indentation, the
order of certain declarations, etc.). The javadoc code analyser, on
the other hand, makes a few assumptions about where the code came from
(that the programmer followed certain common practices). This allows
it to slurp up and save many comments and tie them to the constructs
they describe (conventionally, the declaration on the next line). The
tool remembers enough of your (not-so-superfluous) code formatting to
provide click-through links to particular locations in the source.
Certainly human java programmers "read into the text" a lot more than
the official compiler does, but this extended, extra-grammatical
interpretation process is not exclusive to humans.
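A toy sketch of the convention assumed above, that a comment describes the declaration on the next line. The source text and the extraction rule are invented for illustration; this is not the real javadoc implementation.

```python
source = """\
// Connects two rooms along a compass direction.
void connect(Room a, Room b, String dir) {}
// Number of rooms built so far.
int roomCount;
"""

def extract_docs(text):
    """Tie each // comment to the declaration on the following line."""
    docs = {}
    pending = None
    for lineno, line in enumerate(text.splitlines(), start=1):
        stripped = line.strip()
        if stripped.startswith("//"):
            pending = stripped[2:].strip()      # remember the comment...
        elif pending is not None and stripped:
            docs[stripped] = (pending, lineno)  # ...and attach it to the decl
            pending = None
    return docs

for decl, (doc, lineno) in extract_docs(source).items():
    print(lineno, decl, "->", doc)
```

The line numbers kept alongside each declaration are the "not-so-superfluous" formatting that makes click-through links possible.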

Because I'm a fan of Richard's, this next example uses Prolog.
Metaprogramming regularly involves reading deeper and deeper meanings
from a snippet of object language (with the meta-language being
something we are comfortable calling a programming language). The
Prolog snippet "connect_via(kitchen,dining_room,west)." is a 100%
complete and valid program, but it doesn't do much to execute it
(other than populate a conceptual table). By piling on more and more
assumptions in the surrounding code, we can infer from this snippet
(1) an instruction in the larger process of building a house, (2) a
specification for how some existing house was built, (3) a description
of (query for) houses out of some external database, or many other
things. It's not that we can get these meanings because the snippet
ended with a full stop, making it a complete statement; just knowing
that a loose term of tha
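The three readings above can be sketched as follows, using Python as a stand-in for the Prolog meta-language. The interpreter names, strings, and database are my own invention; the point is only that the same fact yields different meanings under different surrounding assumptions.

```python
fact = ("connect_via", "kitchen", "dining_room", "west")

def as_build_instruction(f):
    """Reading (1): an instruction in the process of building a house."""
    _, a, b, d = f
    return f"build a {d} doorway from {a} to {b}"

def as_specification(f):
    """Reading (2): a spec for how some existing house was built."""
    _, a, b, d = f
    return f"{a} must connect {d} to {b}"

def as_query(f, database):
    """Reading (3): a pattern matched against an external database."""
    return [house for house, facts in database.items() if f in facts]

db = {"house1": {fact}, "house2": set()}
print(as_build_instruction(fact))
print(as_query(fact, db))  # only house1 contains the matching fact
```

Each interpreter embodies a different pile of assumptions about where the snippet came from, which is the creep toward the "natural" end of the spectrum described above.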