[ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
If I may chime in. The original point made by Uday re classical logic:
> Coming to classical logic per se, I believe it is ill-fit for describing
> computer programs or "processes"
is certainly worthy of attention. But it does not seem to imply the conclusion
of that mail:
> It is time to leave behind the classical logic. In fact, we should have
> done it a long time ago.
(even if it wasn't intended, it does indeed sound "like a total and unconditional
rejection"... such things happen in the fervor of a discussion :-)
"Logical pluralism" is a position rather well-established in the philosophy of
logic. I would think that in the context of Computer Science, it is even more tempting.
[incidentally and perhaps contrary to established views, even Brouwer himself
could perhaps be seen as one of the first logical pluralists. While he very
strongly rejected Fregean-Russellian logicism in *foundations of mathematics*,
he always held the work of Boole and the whole algebraic tradition in logic
in high regard... But this is an aside]
It might even happen to be Uday's own position, if I understand correctly the remark that
"we can perfectly well circumscribe various regimens for various purposes."
Most of my email will elaborate on this.
I would simply say that whenever one wants, needs or has to think of all propositional
formulas (also those possibly involving implication, and also those involving fusion,
"tensor product" or what have you) as *rewritable to a conjunctive-disjunctive
normal form without loss of information*, then the underlying domain logic is essentially
classical.
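To make this concrete, here is a toy sketch (entirely my own, just for
illustration) of the usual CNF translation, with the rewrite steps that are
only *classically* information-preserving marked as such:

  -- A toy propositional syntax; the names are mine, for illustration only.
  data Formula
    = Var String
    | Not Formula
    | And Formula Formula
    | Or  Formula Formula
    | Imp Formula Formula
    deriving Show

  -- Negation normal form: eliminate Imp, push Not inward.
  nnf :: Formula -> Formula
  nnf (Imp a b)       = nnf (Or (Not a) b)   -- A -> B ~> ~A \/ B: classical only
  nnf (Not (Imp a b)) = nnf (And a (Not b))  -- ~(A -> B) ~> A /\ ~B: classical only
  nnf (Not (Not a))   = nnf a                -- ~~A ~> A: classical only
  nnf (Not (And a b)) = Or  (nnf (Not a)) (nnf (Not b))  -- De Morgan: this direction is classical only
  nnf (Not (Or a b))  = And (nnf (Not a)) (nnf (Not b))  -- De Morgan: intuitionistically fine
  nnf (And a b)       = And (nnf a) (nnf b)
  nnf (Or a b)        = Or  (nnf a) (nnf b)
  nnf f               = f

  -- CNF: distribute Or over And in an NNF formula.
  cnf :: Formula -> Formula
  cnf = dist . nnf
    where
      dist (And a b)     = And (dist a) (dist b)
      dist (Or a b)      = distOr (dist a) (dist b)
      dist f             = f
      distOr (And a b) c = And (distOr a c) (distOr b c)
      distOr a (And b c) = And (distOr a b) (distOr a c)
      distOr a b         = Or a b

Read constructively, the marked rewrites discard information; accepted
wholesale, they make the logic classical.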
It is hard to imagine whole areas of Theoretical CS if rewriting formulas to
CNF, or proofs by contradiction/contraposition/excluded middle, were suddenly
deemed outdated and/or illegal... I mean not only and not even primarily logic
programming, but also finite model theory, complexity theory,
ontologies/description logics or the whole PODS/VLDB community...
[actually, as a curious aside, the logic of database theorists, while certainly not constructive, is not fully
classical either. They dread the top constant and unrestricted negation, preferring instead relative complement. This
has to do with assumptions such as "closed world", "active domain" and the demand that queries be
"domain independent". In short, their logic is rather that of Boolean rings without identity, which---funnily
enough---also happens to be the setting of Stone's original work. It is just a contemporary and ahistorical
misrepresentation to say that Stone was working with "Boolean algebras". But this, again, is an aside...]
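A toy illustration of that preference, again of my own making: the relative
complement of two stored relations is computable from the data alone, hence
domain independent, while unrestricted negation needs an explicit ambient
domain, i.e. precisely the dreaded top element.

  import qualified Data.Set as Set

  -- "R minus S": determined by the stored relations alone,
  -- hence domain independent.
  minusQuery :: Ord a => Set.Set a -> Set.Set a -> Set.Set a
  minusQuery r s = r `Set.difference` s

  -- "everything not in R": needs an explicit ambient domain
  -- (a top element), which is exactly what database theorists avoid.
  negQuery :: Ord a => Set.Set a -> Set.Set a -> Set.Set a
  negQuery domain r = domain `Set.difference` r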
And even in the context of Curry-Howard correspondence, classical logic is a
legitimate setting to discuss languages with control operators, first-class
continuations, static catch/throw a la Scheme etc. There is so much stunningly
beautiful work in that community that deserves to be better known...
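The textbook one-liner here, in Haskell's continuation monad (standing in for
any language with first-class continuations): Peirce's law, classically valid
but not intuitionistically provable, is inhabited precisely by call/cc.

  import Control.Monad.Cont (Cont, callCC)

  -- ((A -> B) -> A) -> A: Peirce's law. Its program is a control operator.
  peirce :: ((a -> Cont r b) -> Cont r a) -> Cont r a
  peirce = callCC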
But, equally obviously, not all programming languages have such constructs.
Furthermore, as linear logicians (already mentioned by Uday) will be happy to
tell you, there are contexts where even the intuitionistic notion of implication (so
also the one of topos-theorists or of proof assistants, for example) is way too
coarse-grained. Particularly when one wants, needs or has to be resource-aware.
Also, the recent work of Wadler, Pfenning and other authors suggests that the
Curry-Howard correspondence for concurrency will have to do with linear rather
than intuitionistic logic.
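For a small taste of what resource-awareness buys, here is a toy sketch using
GHC's LinearTypes extension (GHC >= 9.0; the names are mine, and this is of
course not the session-typed calculi of the work just mentioned):

  {-# LANGUAGE LinearTypes #-}

  -- A resource that the type system forces us to consume exactly once.
  newtype Ticket = Ticket ()

  withTicket :: (Ticket %1 -> r) -> r
  withTicket k = k (Ticket ())

  punch :: Ticket %1 -> ()
  punch (Ticket u) = u

  fine :: ()
  fine = withTicket punch

  -- Rejected by the type checker: the ticket is consumed twice.
  -- cheat :: ()
  -- cheat = withTicket (\t -> case punch t of () -> punch t)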
[And as substructural logicians will be happy to tell you, there are contexts
where even linear logicians may seem coarse-grained, thick-skinned,
corner-cutting brutes. :-) But this, yet again, is an aside.]
But where I most likely would part ways with Uday is when he claims (if I understand
correctly) that we are approaching, or even should approach, "a final answer" of
any kind. To me, searching for one logic valid in all CS-relevant contexts seems a rather
misguided enterprise, at least when we talk about logic understood as a
formal inference system.
What we perhaps need is more introductory logic courses---and also handbooks
and monographs---for budding CS undergraduates and graduates (and perhaps also
some postgraduates) which would make them understand the subtlety and
complexity of the picture. And the benefits and costs of adopting specific
inference rules.
Proof-assistant-based courses seem to go in just the right direction. I am currently
teaching one based on the excellent "Software Foundations" material of Benjamin
Pierce et al. I think it changes and sharpens not only the thinking of the students, but also
that of the teacher himself (or herself :-).
But even this only goes so far---after all, the underlying logic is essentially
intuitionistic... on the other hand, any weaker one could quickly become a
nightmare for actually discussing things as demanding as semantics of
programming languages (with bangs and exclamation marks in every second
lemma... :-)
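To make the "essentially intuitionistic" point concrete, here is what it looks
like in Lean (a different assistant than the Coq underlying "Software
Foundations", used here purely for illustration): double-negation elimination
is not derivable in the core logic and must be imported from the classical
axioms.

  -- Not derivable intuitionistically; Lean provides it via its classical axioms.
  theorem dne (p : Prop) : ¬¬p → p :=
    fun h => Classical.byContradiction h

  #print axioms dne  -- reports a dependency on 'Classical.choice'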
To conclude, a few minor points:
> In fact, we cannot accept that we have a final answer until the entire natural
> language has been formalized
We'll wait for this only a little longer than for the invention of the perpetuum
mobile and the heat death of the universe... :-)
And which "natural language" are we talking about? Sometimes I think the only reason why,
e.g., Chomsky ever came up with the idea of "universal grammar" was that he did not speak
many languages in the first place (although Hebrew seems reasonably distant from English)...
> (The view I take, following Quine, is that logic is a regimentation of natural
> language.
Same objection as above, and this is just to begin with.
[The only redeeming features of Quine were that he wrote well and had a certain logical
culture. As a philosopher, in my opinion, he had a rather destructive influence on the
development of logic, particularly in philosophy departments, even if nowhere near as
disastrous as the neopositivists or the majority of "analytic philosophers".
But this is just one more aside...]
> We can perfectly well circumscribe various regimens for various purposes.
As I said above, I am in perfect agreement with this statement.
> I am entirely happy with the characterization of logical connectives as "information
> composition" operators. But we can only accept it as a good, but vague, intuition. We do not
> know what this "information" is. Neither do we know what the information is about. So, in
> order to claim that classical logic is a canonical information composition calculus, somebody would
> need to formalize those notions.
I think I can agree with every word here. Perhaps the difference then is not so
big...
I guess then that "leaving classical logic behind" meant rather "stop presenting it to students
as the only, final and >>real<< formalism for Computer Scientists, everything else being a marginal
pathology, if mentioned at all"... and if this was indeed intended by this remark, I would have a hard
time disagreeing.
Okay... back then to popcorn and a comfortable seat in the audience...
Best,
t.