Hi,

Ian Piumarta <[email protected]> writes:

> It's an interesting take on "calculus of programming"!  (I'm not sure
> you can really call it differentiation since the function you're
> differentiating isn't continuous, and I'm not sure what a "tangent"
> might mean to a piece of code, and even then you might have to call it
> partial differentiation as there are often other independent variables
> that you could have chosen.  But I digress.)

That is a good point.  It's probably better to define and name new
operators.

> There are clear parallels between your differentiation and simple
> macro expansion. [...]  Actually, maybe what you are doing is more
> like partial evaluation (in the derivative direction)

Yes, you're right, it is partial evaluation.  If I generalize what I
said earlier a little bit, we should be able to move any formal
parameter from the argument list into the method's lexical environment.

I'd suggest that kernels can optionally offer a symbol named '_' to
represent themselves as a value.

So what I called   d.meth(..., Arg, ...)
                   ---------------------
                           dArg

is better written in Sandstone code as a binding of Arg to the kernel

  .method2(...): bind(.meth(..., Arg, ...), 'Arg', _);

or more generally, for any value:

  .method2(...): bind(.meth(..., Arg, ...), 'Arg', Value);

is like writing:

  .method2(Arg1, Arg2): {
     Arg = Value;
     ... body ...
  };
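(For illustration only -- Sandstone isn't runnable here, and all the
names below are mine, not Sandstone's -- the bind above behaves like
partial application in a host language.  In Python, functools.partial
already does the bookkeeping of pre-binding one parameter while leaving
the body untouched:)

```python
# A rough sketch of "bind" as partial evaluation: the formal parameter
# `arg` moves out of the argument list and into the method's
# environment; the code body itself is never modified.
from functools import partial

def meth(arg1, arg, arg2):
    return (arg1, arg * 2, arg2)

def bind(method, name, value):
    """Return a new callable with `name` pre-bound to `value`."""
    return partial(method, **{name: value})

method2 = bind(meth, "arg", 21)
print(method2(arg1="a", arg2="b"))  # ('a', 42, 'b')
```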


> and partial abstraction (or whatever the opposite of evaluation is
> called, in the integrative direction).

Maybe "explicit", to move a binding from the lexical environment into a
formal parameter?

  integral(.meth(...), {prim1, prim2})

goes away, to be replaced by a construct to make a lexical name into an
explicit parameter:

  .method2(..., ArgI, ...): explicit(.meth(...), '_', I);

These definitions make for a much saner implementation, which need only
create a new method binding with a different lexical environment and
formal parameter list.  No modification of the code body is needed.
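To make that last point concrete, here is a hedged Python sketch (again,
every name is mine, an assumption rather than real Sandstone): bind and
explicit each construct a fresh method record with a different parameter
list and lexical environment, while sharing the original body object
unmodified:

```python
# Sketch of bind/explicit as pure rebinding: only the formal parameter
# list and lexical environment change; the body is shared as-is.
class Method:
    def __init__(self, params, env, body):
        self.params = list(params)  # formal parameter names
        self.env = dict(env)        # lexical environment
        self.body = body            # shared, never modified

    def __call__(self, *args):
        scope = {**self.env, **dict(zip(self.params, args))}
        return self.body(scope)

def bind(m, name, value):
    """Partial evaluation: move `name` from params into the environment."""
    params = [p for p in m.params if p != name]
    return Method(params, {**m.env, name: value}, m.body)

def explicit(m, name):
    """The inverse: move `name` from the environment back into params."""
    env = {k: v for k, v in m.env.items() if k != name}
    return Method(m.params + [name], env, m.body)

body = lambda s: s["x"] + s["y"]
meth = Method(["x", "y"], {}, body)
m2 = bind(meth, "y", 10)       # y is now pre-bound in the environment
assert m2(5) == 15
m3 = explicit(m2, "y")         # y is a formal parameter again
assert m3(5, 100) == 105
assert m3.body is meth.body    # the code body is shared, unmodified
```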

> The search for the meaning of eval is definitely a worthy one.
> I've been thinking of the meaning of eval more and more as a property
> of the environment, completely outside the physical statement of the
> program and independent of any intrinsic mechanisms of an
> implementation.
> The environment determines entirely the semantics of
> the program: it's a partial function between representation and
> meaning.  For compilation, which is a sequence of your
> ["binds"] or something similar, each step in the sequence is
> the environment partially evaluating the representation to get the
> next lower level of abstraction... until you reach machine code.
> There's a lot in common with kernels here, but maybe orthogonal to
> them.

What I'm calling a "kernel" is a simple generalization of the kernel
idea you and I developed last time I was in LA.  I've moved all the
compiler/interpreter functionality into a kernel, which chooses how to
evaluate/compile the method and which portions of itself to hide or
expose via the syntax.

The bind/explicit methods above are ways of exposing the kernel's
compilation machinery to methods within the language.

> Instead of searching for more powerful or more primitive
> behaviours, the environments are mappings between more or less
> abstract representations of (hopefully) the same behaviour.

In the sane case, yes.  In a less sane case, I consider the kernel
interface to be a generic way of traversing a method's code as a data
structure.

> The evaluator in the tiny Lisp-like thing that I posted a couple of
> months ago (called Maru) works exactly this way.  The partial function
> aspect isn't explained in detail in the paper, but the key is to put
> the "evaluators" and "applicators" arrays in the environment; the rest
> follows easily from that.

I'm not sure I like that idea, but I'll have to see how you implemented
it.  I had fun with a system called Pliant (http://fullpliant.org/) that
was highly reflective in that way, but it caused phase problems when
cross-compiling code because the compiler was an ambient authority and
fully first-class.

> I wonder if it's possible to "integrate" binary code to get the
> original source?  It would save an awful lot of tedious back pointers
> to intermediate structures and parser rules.  I'm going to have to
> think about this some more.

I'm not sure about that.  I think the sweet spot would be to find a
compact but lossless binary representation (of Sandstone, of course) and
run compilers, interpreters, debuggers, and decompilers on it.  I
consider Sandstone to be my answer to UNCOL or ANDF... it is a data
representation format that happens to support object methods in a
compact notation, but leaves all interpretation completely up to the
kernel.

Thanks for your comments,

-- 
Michael FIG <[email protected]> //\
   http://michael.fig.org/    \//

_______________________________________________
fonc mailing list
[email protected]
http://vpri.org/mailman/listinfo/fonc
