Of historical interest, here is a link to Kent Pitman's paper Special Forms
in Lisp <http://www.nhplace.com/kent/Papers/Special-Forms.html>. It
discusses a range of issues raised by FEXPRs (the mechanism for implicit
thunkification in LISP) and his decision *not* to adopt them in his Scheme
dialect NIL. The key statement is in the conclusion:

FEXPR's are only safe if no part of their ``argument list'' is to be
evaluated, and even then only when there is a declaration available in the
environment in which they appear. Using FEXPR's to define control primitives
will be prone to failure due to problems of evaluation context and due to
their potential for confusing program-manipulating programs such as
compilers and macro packages.

Surprisingly for Pitman (he's quite good), the conclusion engages in some
circular reasoning. It is tacitly predicated on the assumption that the
absence of declarations in LISP was a feature. Also, the conclusion doesn't
take into account challenges of macro hygiene
<http://en.wikipedia.org/wiki/Hygienic_macro>.
That discussion didn't enter the literature until 1992, when Kent Dybvig
proposed SYNTAX-CASE for the R4RS version of Scheme. True hygienic macros
took several years after that to be realized, and are now (I think) being
subsumed by the idea of staged compilation.

BitC (along with most typed languages) requires either definition or
declaration ahead of use, so that part of the argument doesn't hold. If we
start from the point of view that declarations are desirable, then
essentially all of Pitman's conclusions are reversed. Given declarations,
there is only a trivial difference from the tool/compiler perspective
between:

  f({ *expr* })  // explicit thunkification
  g(*expr*)     // implicit thunkification due to declaration by g
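The distinction can be sketched concretely in Scala, whose by-name
parameters provide just this kind of declaration-driven implicit
thunkification (the function names f and g are illustrative, mirroring
the two forms above, not anything from BitC):

```scala
// Explicit thunkification: the caller must wrap the expression
// in a thunk at the call site.
def f(thunk: () => Int): Int = thunk() + 1

// Implicit thunkification: the by-name declaration (=> Int) on g
// makes the compiler wrap the argument at every call site.
def g(expr: => Int): Int = expr + 1

// Both calls compute the same value; only the caller-side syntax
// differs, and the difference is fully determined by g's declaration.
println(f(() => 41)) // explicit: f({ *expr* })
println(g(41))       // implicit: g(*expr*)
```

Given g's declaration, a compiler or tool sees the same information in
both forms; the thunk is simply spelled by the caller in one case and
inserted by the compiler in the other.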

The argument that the compiler cannot optimize the two equivalently is
clearly falsified if we assume the presence of declarations. The main
impact from a tool perspective is that it becomes harder for tools to
process code that does not compile.

In my view, for any part of the code that fails to compile, tools are
limited to quasi-syntactic processing in any case. The syntax of the
*expr* in the source code doesn't really change here, so it's hard to
see what (if anything) is being lost at the call site from a tool
perspective.

All this being said, I do agree with Mark Miller that implicit
thunkification is a feature with lots of potential for confusion, and one
that is best applied with a delicate touch.
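One concrete source of that confusion, sketched again in Scala (the
names tick and twice are mine, purely for illustration): a by-name
parameter is re-evaluated at each use inside the callee, which can
surprise a caller who reads the call site as ordinary call-by-value.

```scala
// tick() has a side effect: it increments and returns a counter.
var calls = 0
def tick(): Int = { calls += 1; calls }

// twice uses its by-name argument two times, so the implicitly
// thunkified expression runs twice inside the callee.
def twice(expr: => Int): Int = expr + expr

val result = twice(tick()) // tick() evaluates twice: 1 + 2
println(result)            // 3
println(calls)             // 2, not the 1 a call-by-value reader expects
```

Nothing at the call site signals that tick() will run more than once;
only twice's declaration does. That is exactly the kind of subtlety
that argues for the delicate touch.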

shap
_______________________________________________
bitc-dev mailing list
[email protected]
http://www.coyotos.org/mailman/listinfo/bitc-dev
