Raul wrote:

>Anyways, there probably is a case to be made for bringing back J's
>support for conjunction trains. I cannot make it myself, but you and
>Pepe could probably put together something coherent.

Something is cooking, stay tuned...


On Tue, Feb 19, 2013 at 10:16 AM, Raul Miller <[email protected]> wrote:

> On Mon, Feb 18, 2013 at 7:07 PM, Dan Bron <[email protected]> wrote:
> > [Reviving a thread from a couple weeks back]
> > Raul responded:
> >>  The sentences used in explicit definitions are subject to
> >>  the same algebraic manipulations.
> >
> > True, in the sense that both Lisp and C have macros.  I think it's worth
> > exploring the analogy (Lisp macros) : (C macros) ::
> > (Tacit metaprogramming) : (Explicit metaprogramming).
>
> Perhaps. We should also be careful with this analogy -- it has some
> significant flaws -- but it is not entirely without use.
>
> > In functional programming circles, the distinction between textual
> > substitution (a-la C macros) and
> > manipulation-of-code-as-structured-data-objects (Lisp macros) is much
> > discussed, usually to emphasize the superiority of the Lisp approach over
> > C-like macros (which are excoriated).
>
> Note that there are several distinctions here:
>
> LISP - traditionally interpreted
> C - traditionally compiled
>
> LISP - can allow arbitrarily complex computation in macro contexts
> C - rigidly limited set of string manipulations allowed in macros
>
> LISP - traditionally not portable across architectures
> C - traditionally portable, albeit allowing target-specific code
>
> LISP - memory management usually built in
> C - memory management duties split across library and users
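The contrast drawn above, between textual substitution and manipulation of code as structured data, can be sketched in Python, whose `ast` module plays the Lisp-like role of exposing code as a parse tree. (The names and snippets here are illustrative only; nothing below comes from the thread itself.)

```python
import ast

# Textual substitution, C-macro style: blind string replacement.
# Renaming "x" in the source text also mangles the "x" inside the
# string literal -- the substitution has no notion of structure.
src = 'y = x + 1\nprint("x marks the spot")'
print(src.replace("x", "total"))  # corrupts the string literal too

# Structured manipulation, Lisp-macro style: walk the parse tree
# and rename only genuine identifier nodes.
class Rename(ast.NodeTransformer):
    def visit_Name(self, node):
        if node.id == "x":
            node.id = "total"
        return node

tree = ast.fix_missing_locations(Rename().visit(ast.parse(src)))
print(ast.unparse(tree))  # string literal left intact
```

The second approach is also reversible in the sense discussed later in this message: `ast.unparse` maps the modular representation back to text.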
>
> Anyways, when talking about bodies of criticisms I think we have to be
> aware that people will often express criticism about one feature set
> as a consequence of frustrations caused by another (though often
> overlapping) feature set.
>
> That said, the advantage of the lisp approach is that having defined
> modular components to represent code is important when transforming
> that code.  And, ironically, the design of gcc (the gnu c compiler) is
> in this sense very "lisp-like" - perhaps more "lisp-like" than lisp
> (except, of course, in the exposed language semantics).
>
> But one thing that is lacking so far in this discussion is the concept
> behind &. in J, which, in the context of code structure, we might refer
> to as "serialization".
>
> Of course, at some point in an evaluation information must be
> discarded (for example: 32 { a.) and that's an irreversible process.
> [You can work around this issue by retaining references to other
> information but that has costs.]  On the other hand, if we are
> concerned with parse trees, and equivalent manipulations of parse
> trees, we should be working with reversible transformations.  And, in
> some cases (not all, but I'll save that discussion for later) using
> the reverse transformation (parsed modular representation of code back
> to textual representation) can be appropriate.
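J's u&.v ("under") applies v, then u, then v's inverse, which is why reversibility matters here: the pattern is only coherent when v can be undone. A rough Python sketch (the `under` helper is hypothetical, mirroring the J conjunction; it is not part of any library):

```python
import math

def under(u, v, v_inv):
    """Hypothetical analogue of J's u&.v: apply v, then u, then undo v."""
    return lambda x: v_inv(u(v(x)))

# Halving "under the logarithm" computes a square root,
# since exp(log(x) / 2) == x ** 0.5.  In J: -:&.^.
sqrt_via_log = under(lambda t: t / 2, math.log, math.exp)
print(round(sqrt_via_log(16.0), 9))  # 4.0

# By contrast, plain selection (J's 32 { a.) is irreversible: element 32
# of the character set is ' ', and the index 32 cannot be recovered from
# the character alone without retaining a reference to the lookup.
print(chr(32) == ' ')
```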
>
> > The general claim is that having a well-understood object (with a
> > documented and analytical structure) to represent "latent code" allows
> > metaprogrammers to understand the role and context of that code in
> > depth, as opposed to relying on incomplete and sometimes-shaky
> > inferences from an opaque string.  Furthermore (so the argument goes),
> > having a well-understood code-object allows us to make _substitutions_
> > with great confidence (formal justification), leading to the much
> > lauded referential transparency of functional programming, aka
> > "equational reasoning".
>
> I will not disagree here, but I will note that understanding is a much
> deeper subject than anything I am prepared to treat in this message.
>
> > It is this equational reasoning which I was trying to highlight in the
> > series of posts I made earlier in this thread, as a key advantage of
> > tacit programming over other, competing (but oftentimes complementary),
> > styles.
>
> I agree here, especially with your mention of the complementary aspect.
>
> > With all that said, outside of J, I have limited experience in
> > functional programming, and only a surface understanding of the
> > advantages it confers (or its advocates claim for it, anyway).  I have
> > never programmed in Lisp, Scheme, or Haskell, for example (outside of
> > toying with the languages and their environments).  So I'm interested
> > to hear what others think of this Lisp : Tacit :: C : Explicit analogy,
> > or the relative amenability of tacit and explicit code to equational
> > reasoning.
>
> Note that we have two ways of distributing code: as text, and as
> binary.  I think that it's fair to say that the textual representation
> typically lends itself to understanding more than the binary
> representation.
>
> I think the real issue here is that having an intermediate
> representation gives us some transformational power.  But this is not
> the only kind of useful capability - simplicity, for example, can also
> be important (on the third hand, there are multiple kinds of
> simplicity).
>
> Anyways, there probably is a case to be made for bringing back J's
> support for conjunction trains. I cannot make it myself, but you and
> Pepe could probably put together something coherent.
>
> Meanwhile, I would be cautious about meta-arguments.  Those kinds of
> things can be useful for finding topics of interest but I do not think
> they can replace concrete examples.  When we are entirely abstract
> it's easy to lose track of what we are talking about.
>
> Thanks,
>
> --
> Raul
> ----------------------------------------------------------------------
> For information about J forums see http://www.jsoftware.com/forums.htm
>