Re: [Haskell-cafe] Re: What *is* a DSL?
On 2009-10-22 14:44, Robert Atkey wrote:
> On Sat, 2009-10-10 at 20:12 +0200, Ben Franksen wrote:
>> Since 'some' is defined recursively, this creates an infinite
>> production for numbers that you can neither print nor otherwise
>> analyse in finite time.
>
> Yes, sorry, I should have been more careful there. One has to be
> careful to handle EDSLs that have potentially infinite syntax properly.

I find it useful to carefully distinguish between inductive and
coinductive components of the syntax. Consider the following recogniser
combinator language, implemented in Agda, for instance:

  data P : Bool → Set where
    ∅   : P false
    ε   : P true
    tok : Bool → P false
    _∣_ : ∀ {n₁ n₂} → P n₁ → P n₂ → P (n₁ ∨ n₂)
    _·_ : ∀ {n₁ n₂} → P n₁ → ∞? n₁ (P n₂) → P (n₁ ∧ n₂)

The recognisers are indexed on their nullability; the index is true iff
the recogniser accepts the empty string. The definition of P is
inductive, except that the right argument of the sequencing combinator
(_·_) is allowed to be coinductive when the left argument does not
accept the empty string:

  ∞? : Bool → Set → Set
  ∞? true  A = A
  ∞? false A = ∞ A

(You can read ∞ A as a suspended computation of type A.) The
limitations imposed upon coinduction in the definition of P ensure that
recognition is decidable. For more details, see
http://www.cs.nott.ac.uk/~nad/listings/parser-combinators/TotalRecognisers.html.

-- 
/NAD

_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe
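The nullability index (though not the inductive/coinductive discipline,
which Haskell's pervasive laziness cannot express) can be transcribed into
Haskell with DataKinds. The following is only a sketch of that one idea;
all names here are mine, and recognition is done naively by trying every
split, which terminates because the grammar value is finite:

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures, TypeFamilies #-}

import Data.List (inits, tails)

type family Or (a :: Bool) (b :: Bool) :: Bool where
  Or 'True  b = 'True
  Or 'False b = b

type family And (a :: Bool) (b :: Bool) :: Bool where
  And 'True  b = b
  And 'False b = 'False

-- P n: a recogniser over Char whose index n records whether it accepts
-- the empty string (the nullability index from the Agda code).
data P (n :: Bool) where
  Fail  :: P 'False                          -- ∅
  Eps   :: P 'True                           -- ε
  Tok   :: Char -> P 'False                  -- tok
  (:|:) :: P n1 -> P n2 -> P (Or n1 n2)      -- _∣_
  (:.:) :: P n1 -> P n2 -> P (And n1 n2)     -- _·_

-- Naive recognition by trying every split of the input string.
accepts :: P n -> String -> Bool
accepts Fail      _ = False
accepts Eps       s = null s
accepts (Tok c)   s = s == [c]
accepts (p :|: q) s = accepts p s || accepts q s
accepts (p :.: q) s =
  or [ accepts p s1 && accepts q s2 | (s1, s2) <- zip (inits s) (tails s) ]
```

The payoff of the index is that nullability is available statically; the
total, derivative-based recognition of the Agda development is not
attempted here.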
Re: [Haskell-cafe] Re: What *is* a DSL?
On Sat, 2009-10-10 at 20:12 +0200, Ben Franksen wrote:
> Since 'some' is defined recursively, this creates an infinite
> production for numbers that you can neither print nor otherwise
> analyse in finite time.

Yes, sorry, I should have been more careful there. One has to be
careful to handle EDSLs that have potentially infinite syntax properly.

> I can see at least two solutions: One is to parameterize everything
> over the type of terminals, too. The second solution (which I
> followed) is to break the recursion by adding another nonterminal to
> the NT type:

A third solution is to add the Kleene star to the grammar DSL, so the
representation of productions becomes:

  data Production nt a
    = Stop a
    | Terminal Char (Production nt a)
    | forall b. NonTerminal (nt b) (Production nt (b -> a))
    | forall b. Star (Production nt b) (Production nt ([b] -> a))

You also need to add the necessary parts for Alternative to the
Production type too, because they may be nested inside Star
constructors:

    | Zero
    | Choose (Production nt a) (Production nt a)

The type Production nt is now directly an Applicative and an
Alternative and also has the combinator:

  star :: Production nt a -> Production nt [a]
  star p = Star p $ Stop id

The type of grammars is changed to (with the addition of the starting
nonterminal, as you point out):

  type Grammar nt = forall a. nt a -> Production nt a

It is probably also possible to write a function that converts grammars
with "Star"s in to ones without by introducing new non-terminals in the
way you did.

Bob

-- 
The University of Edinburgh is a charitable body, registered in
Scotland, with registration number SC005336.
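That star-elimination can be sketched on an untyped grammar
representation (the typed version is exactly the open question later in
the thread; all names below, like elimStar, are mine): each Star becomes
a fresh right-recursive nonterminal N with N ::= body N | ε.

```haskell
-- Untyped grammar representation: a rule is a list of alternatives,
-- each alternative a sequence of symbols.
data Sym = T Char | N String | Star [Sym] deriving (Eq, Show)
type Alts    = [[Sym]]
type Grammar = [(String, Alts)]

-- elimSyms counter syms = (counter', generated rules, rewritten syms)
elimSyms :: Int -> [Sym] -> (Int, Grammar, [Sym])
elimSyms n []     = (n, [], [])
elimSyms n (s:ss) =
  case s of
    Star body ->
      let (n1, rs1, body') = elimSyms n body
          nm               = "_star" ++ show n1
          starRule         = (nm, [body' ++ [N nm], []])  -- N ::= body N | ε
          (n2, rs2, ss')   = elimSyms (n1 + 1) ss
      in (n2, rs1 ++ [starRule] ++ rs2, N nm : ss')
    _ ->
      let (n', rs, ss') = elimSyms n ss
      in (n', rs, s : ss')

-- Rewrite every rule, appending the generated star rules.
elimStar :: Grammar -> Grammar
elimStar = go 0
  where
    go _ [] = []
    go n ((nm, alts) : rest) =
      let (n', rs, alts') = elimAlts n alts
      in (nm, alts') : rs ++ go n' rest
    elimAlts n []     = (n, [], [])
    elimAlts n (a:as) =
      let (n1, rs1, a')  = elimSyms n a
          (n2, rs2, as') = elimAlts n1 as
      in (n2, rs1 ++ rs2, a' : as')
```

For example, a rule Number ::= ['d']* becomes Number ::= _star0 together
with _star0 ::= 'd' _star0 | ε.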
Re: [Haskell-cafe] Re: What *is* a DSL?
On Sun, 2009-10-11 at 21:54 +0200, Ben Franksen wrote:
> Ben Franksen wrote:
>> Next thing I'll try is to transform such a grammar into an actual
>> parser...
>
> Which I also managed to get working. However, this exposed yet another
> problem I am not sure how to solve.

Another option is to not use a recursive descent parser, and to switch
to a parsing algorithm that handles any context-free grammar, such as
CYK or Earley's algorithm. A little test implementation of a well-typed
version of the CYK algorithm seems to work, and seems to be as
efficient as the normal imperative one if enough memoisation is used.
I'm trying to see if I can get Earley's algorithm to work nicely in the
well-typed setting.

Bob
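For reference, an *untyped* CYK recogniser is short; this sketch (my
names, nothing like Bob's well-typed version) takes a grammar in
Chomsky normal form and gets its memoisation from the laziness of the
values stored in a Map:

```haskell
import qualified Data.Map as M

-- CNF grammar: unit rules A -> c and binary rules A -> B C.
data CNF = CNF
  { units :: [(String, Char)]
  , bins  :: [(String, (String, String))]
  }

-- Table mapping (position, length) to the nonterminals deriving that
-- substring; each cell is computed at most once thanks to laziness.
cykTable :: CNF -> String -> M.Map (Int, Int) [String]
cykTable (CNF us bs) w = table
  where
    n = length w
    table = M.fromList [ ((i, l), cell i l) | l <- [1 .. n], i <- [0 .. n - l] ]
    cell i 1 = [ a | (a, c) <- us, w !! i == c ]
    cell i l = [ a | k <- [1 .. l - 1]
                   , (a, (b, c)) <- bs
                   , b `elem` table M.! (i, k)
                   , c `elem` table M.! (i + k, l - k) ]

recognises :: CNF -> String -> String -> Bool
recognises g s w =
  not (null w) && s `elem` (cykTable g w M.! (0, length w))
```

For instance, with S -> A T | A B, T -> S B, A -> 'a', B -> 'b' (a CNF
grammar for aⁿbⁿ, n ≥ 1), `recognises` accepts "ab" and "aabb" and
rejects "aab". The well-typed version would additionally build a typed
semantic value per nonterminal, which is where the interesting work is.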
Re: [Haskell-cafe] Re: What *is* a DSL?
This problem of dynamically transforming grammars and building parsers
out of it is addressed in:

  @inproceedings{1411296,
    author    = {Viera, Marcos and Swierstra, S. Doaitse and Lempsink, Eelco},
    title     = {Haskell, do you read me?: constructing and composing
                 efficient top-down parsers at runtime},
    booktitle = {Haskell '08: Proceedings of the first ACM SIGPLAN
                 symposium on Haskell},
    year      = {2008},
    isbn      = {978-1-60558-064-7},
    pages     = {63--74},
    location  = {Victoria, BC, Canada},
    doi       = {http://doi.acm.org/10.1145/1411286.1411296},
    publisher = {ACM},
    address   = {New York, NY, USA},
  }

and the code can be found on Hackage under the name ChristmasTree. The
left-factorisation is explained in a paper we presented at the last
LDTA and which will appear in ENTCS. Since we have signed some
copyright form I do not think I can attach it here, but if you send me
a mail, I can definitely send you the paper.

Doaitse

On 11 okt 2009, at 21:54, Ben Franksen wrote:
> Ben Franksen wrote:
>> Next thing I'll try is to transform such a grammar into an actual
>> parser...
>
> Which I also managed to get working. However, this exposed yet
> another problem I am not sure how to solve.
>
> The problem manifests itself with non-left-factored rules like
>
>   Number ::= Digit Number | Digit
>
> Translating such a grammar directly into a Parsec parser leads to
> parse errors because Parsec's choice operator is predictive: if a
> production has consumed any input the whole choice fails, even if
> alternative productions would not:
>
>   *Main> P.parseTest (parseGrammar myGrm) "2"
>   parse error at (line 1, column 2):
>   unexpected end of input
>   expecting Number
>
> Of course, one solution is to apply Parsec's try combinator to all
> choices in a rule. But this rather defeats the purpose of using a (by
> default) predictive parser in the first place, which is to avoid
> unnecessary backtracking. So, a better solution is to left-factor the
> grammar before translating to Parsec. Since we have a data
> representation of the grammar that we can readily analyse and
> transform, this should be possible given some suitable algorithm.
>
> But how is this transformation to be typed? My first naive attempt
> was to define (recap: nt :: * -> * is the type of nonterminals,
> t :: * is the type of terminals a.k.a. tokens, and a is the result
> type):
>
>   leftFactor :: Grammar nt t a -> Grammar nt t a
>
> Of course, this is wrong: left-factoring is expected to introduce new
> nonterminals, so on the right-hand side of the type we should not
> have the same 'nt' as on the left. Instead we should have some other
> type that is 'nt' extended with new constructors. Moreover, we cannot
> statically know how many new nonterminals are added, so we cannot
> simply apply a type function to nt. Is this solvable at all in
> Haskell or do I need proper dependent types to express this?
>
> I have very vague ideas that revolve around setting up some recursive
> type function that on each level adds one constructor, define a
> common interface with a (multiparam) type class and then use
> existential quantification in the result type to hide the resulting
> type of nonterminals.
>
> The next question is: even if this turns out to be possible, isn't it
> overkill? Maybe it is better to use an infinite type for the
> nonterminals in the first place and let the grammar be a partial
> function? OTOH, the formulation of the grammar as a function that
> pattern matches on the nonterminals is very elegant.
>
> Cheers
> Ben
Re: [Haskell-cafe] Re: What *is* a DSL?
On Sun, Oct 11, 2009 at 06:29:58PM -0400, Brandon S. Allbery KF8NH wrote:
> On Oct 11, 2009, at 18:00 , Ben Franksen wrote:
>> Ben Franksen wrote:
>>> Ben Franksen wrote:
>>>> Next thing I'll try is to transform such a grammar into an actual
>>>> parser...
>>>
>>> Which I also managed to get working.
>>
>> First, before all this talking to myself here is boring you to death,
>> please shout and I'll go away. Anyway, at least one person has
>> privately expressed interest, so I'll post my code for the
>> translation.(*)
>
> It's -cafe, so let 'er rip. And maybe write it up for TMR, if you don't

Yes please! =)

-Brent
[Haskell-cafe] Re: What *is* a DSL?
Ben Franksen wrote:
> Next thing I'll try is to transform such a grammar into an actual
> parser...

Which I also managed to get working. However, this exposed yet another
problem I am not sure how to solve.

The problem manifests itself with non-left-factored rules like

  Number ::= Digit Number | Digit

Translating such a grammar directly into a Parsec parser leads to parse
errors because Parsec's choice operator is predictive: if a production
has consumed any input the whole choice fails, even if alternative
productions would not:

  *Main> P.parseTest (parseGrammar myGrm) "2"
  parse error at (line 1, column 2):
  unexpected end of input
  expecting Number

Of course, one solution is to apply Parsec's try combinator to all
choices in a rule. But this rather defeats the purpose of using a (by
default) predictive parser in the first place, which is to avoid
unnecessary backtracking. So, a better solution is to left-factor the
grammar before translating to Parsec. Since we have a data
representation of the grammar that we can readily analyse and
transform, this should be possible given some suitable algorithm.

But how is this transformation to be typed? My first naive attempt was
to define (recap: nt :: * -> * is the type of nonterminals, t :: * is
the type of terminals a.k.a. tokens, and a is the result type):

  leftFactor :: Grammar nt t a -> Grammar nt t a

Of course, this is wrong: left-factoring is expected to introduce new
nonterminals, so on the right-hand side of the type we should not have
the same 'nt' as on the left. Instead we should have some other type
that is 'nt' extended with new constructors. Moreover, we cannot
statically know how many new nonterminals are added, so we cannot
simply apply a type function to nt. Is this solvable at all in Haskell
or do I need proper dependent types to express this?

I have very vague ideas that revolve around setting up some recursive
type function that on each level adds one constructor, define a common
interface with a (multiparam) type class and then use existential
quantification in the result type to hide the resulting type of
nonterminals.

The next question is: even if this turns out to be possible, isn't it
overkill? Maybe it is better to use an infinite type for the
nonterminals in the first place and let the grammar be a partial
function? OTOH, the formulation of the grammar as a function that
pattern matches on the nonterminals is very elegant.

Cheers
Ben
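Setting the typing question aside, the transformation itself is simple
on an untyped representation. One left-factoring pass might look like
this (a hypothetical helper of my own, not an answer to the typing
puzzle):

```haskell
import Data.List (groupBy, sortOn)

data Sym = T Char | N String deriving (Eq, Ord, Show)
type Alts = [[Sym]]   -- a rule's alternatives, each a sequence of symbols

-- One left-factoring pass for a single rule: alternatives sharing a
-- first symbol s collapse into the single alternative  s R' , where the
-- fresh rule R' carries the remaining tails.  Repeating this to a
-- fixpoint (and factoring longest common prefixes, not just the first
-- symbol) gives full left-factoring.
leftFactor :: String -> Alts -> [(String, Alts)]
leftFactor nm alts = (nm, map fst factored) : concatMap snd factored
  where
    groups   = groupBy (\a b -> take 1 a == take 1 b) (sortOn (take 1) alts)
    factored = zipWith factor [0 :: Int ..] groups
    factor _ [p] = (p, [])                        -- no common prefix: keep
    factor i ps@((s:_):_) =
      let nm' = nm ++ "'" ++ show i               -- fresh nonterminal
      in ([s, N nm'], [(nm', map tail ps)])
    factor _ ps  = (head ps, [])
```

On the running example, Number ::= Digit Number | Digit becomes
Number ::= Digit Number'0 with Number'0 ::= Number | ε. The typed
version would have to return these fresh nonterminals in an extended
'nt' type, which is exactly the difficulty above.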
[Haskell-cafe] Re: What *is* a DSL?
Ben Franksen wrote:
> Ben Franksen wrote:
>> Next thing I'll try is to transform such a grammar into an actual
>> parser...
>
> Which I also managed to get working.

First, before all this talking to myself here is boring you to death,
please shout and I'll go away. Anyway, at least one person has
privately expressed interest, so I'll post my code for the
translation.(*)

  {-# LANGUAGE ExistentialQuantification, GADTs, Rank2Types #-}
  {-# LANGUAGE TypeSynonymInstances, MultiParamTypeClasses,
               ImpredicativeTypes #-}

  import qualified Text.ParserCombinators.Parsec as P

Note that I have parameterized everything on the token (terminal) type.
Here are the data types, adapting the rest of the code is completely
mechanical.

  data Production nt t a
    = Stop a
    | Terminal t (Production nt t a)
    | forall b. NonTerminal (nt b) (Production nt t (b -> a))

  newtype Rule nt t a = Rule [Production nt t a]

  type RuleSet nt t = forall a. nt a -> Rule nt t a

  type Grammar nt t b = (nt b, RuleSet nt t)

I should probably turn this into a proper data type, which would BTW
also make the ImpredicativeTypes extension unnecessary.

Translation to Parsec
---------------------

We restrict ourselves to Char as terminals for simplicity. The
generalization to arbitrary token types would need another three
arguments: showTok :: (tok -> String), nextPos :: (SourcePos -> tok ->
[tok] -> SourcePos), and testTok :: (tok -> Maybe a), which are needed
by P.tokenPrim.

  parseGrammar :: Print nt => Grammar nt Char a -> P.Parser a
  parseGrammar (start, rules) = parseNonTerminal rules start

  parseNonTerminal :: Print nt => RuleSet nt Char -> nt a -> P.Parser a
  parseNonTerminal rs nt = parseRule rs (rs nt) P.<?> pr nt

  parseRule :: Print nt => RuleSet nt Char -> Rule nt Char a -> P.Parser a
  parseRule rs (Rule ps) =
    P.choice (map ({- P.try . -} parseProduction rs) ps)

  parseProduction :: Print nt
                  => RuleSet nt Char -> Production nt Char a -> P.Parser a
  parseProduction _  (Stop x)           = return x
  parseProduction rs (Terminal c p)     = P.char c >> parseProduction rs p
  parseProduction rs (NonTerminal nt p) = do
    vnt <- parseNonTerminal rs nt
    vp  <- parseProduction rs p
    return (vp vnt)

This is really not difficult, once you understand how the list-like
Production type works. The trick is that a NonTerminal forces the rest
of the production to return a function type, so you can apply its
result to the result of parsing the nonterminal. Whereas the result of
parsing terminals gets ignored by the rest of the production.

You might wonder how the code manages to return the correct integer
values inside an ENum. Well, I did, at least. I don't yet understand it
completely but I think the answer is in the Functor and Applicative
instances: all the code that interprets syntactic elements (up to the
abstract syntax) inside the myGrm function gets pushed down through the
elements of a production until it ends up at a Stop, where we can
finally pull it out (see the first clause of parseProduction).

Note also the (commented-out) use of P.try in function parseRule. Let's
try it:

  *Main> putStrLn (printGrammar myGrm)
  *Start ::= Sum
  Sum ::= Product '+' Sum | Product
  Product ::= Value '*' Product | Value
  Value ::= Number | '(' Sum ')'
  Number ::= Digit Number | Digit
  Digit ::= '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9'
  *Main> P.parseTest (parseGrammar myGrm) "2*(2+52)"
  parse error at (line 1, column 2):
  unexpected "*"
  expecting Number

After re-inserting the P.try call, I can actually parse expressions
(yay!):

  *Main> :r
  [1 of 1] Compiling Main ( Grammar.lhs, interpreted )
  Ok, modules loaded: Main.
  *Main> P.parseTest (parseGrammar myGrm) "2*(2+52)"
  EProduct (ENum 2) (ESum (ENum 2) (ENum 52))

BTW, does anyone know a source (books, papers, blogs, whatever) about
algorithms for automatic left-factoring? I searched with google and
found some interesting papers on eliminating left recursion but
nothing so far on left-factoring. Have these problems all been solved
before the internet age?

Cheers
Ben

(*) One of these days I really should get my hands dirty and set up a
weblog; suggestions for how to proceed are appreciated. I would
especially like something where I can just upload a literate Haskell
file and it gets formatted automagically. Bonus points for beautifying
operator symbols a la lhs2tex ;-)
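The Functor and Applicative instances Ben refers to are not shown in
this message; they are presumably the usual free-applicative ones,
along these lines (the `terminals` and `run` helpers are mine, added to
make the "pushed down to the Stop" behaviour observable):

```haskell
{-# LANGUAGE ExistentialQuantification #-}

-- The list-like production type from the post.
data Production nt t a
  = Stop a
  | Terminal t (Production nt t a)
  | forall b. NonTerminal (nt b) (Production nt t (b -> a))

-- fmap pushes the result-building function through to the final Stop;
-- at a NonTerminal it composes with the function stored there.
instance Functor (Production nt t) where
  fmap f (Stop x)          = Stop (f x)
  fmap f (Terminal t p)    = Terminal t (fmap f p)
  fmap f (NonTerminal n p) = NonTerminal n (fmap (f .) p)

-- <*> concatenates productions, free-applicative style.
instance Applicative (Production nt t) where
  pure = Stop
  Stop f          <*> q = fmap f q
  Terminal t p    <*> q = Terminal t (p <*> q)
  NonTerminal n p <*> q = NonTerminal n (flip <$> p <*> q)

-- Two illustrative interpreters: collect the terminals of a
-- production, and run a production containing no nonterminals.
terminals :: Production nt t a -> [t]
terminals (Stop _)          = []
terminals (Terminal t p)    = t : terminals p
terminals (NonTerminal _ p) = terminals p

run :: Production nt t a -> Maybe a
run (Stop x)          = Just x
run (Terminal _ p)    = run p
run (NonTerminal _ _) = Nothing
```

For example, `(,) <$> Terminal 'a' (Stop 1) <*> Terminal 'b' (Stop 2)`
reduces to `Terminal 'a' (Terminal 'b' (Stop (1, 2)))`: the pairing
function has migrated to the Stop, which is exactly what the first
clause of parseProduction pulls out.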
Re: [Haskell-cafe] Re: What *is* a DSL?
On Oct 11, 2009, at 18:00 , Ben Franksen wrote:
> Ben Franksen wrote:
>> Ben Franksen wrote:
>>> Next thing I'll try is to transform such a grammar into an actual
>>> parser...
>>
>> Which I also managed to get working.
>
> First, before all this talking to myself here is boring you to death,
> please shout and I'll go away. Anyway, at least one person has
> privately expressed interest, so I'll post my code for the
> translation.(*)

It's -cafe, so let 'er rip. And maybe write it up for TMR, if you don't
want to set up a blog with all the goodies?

-- 
brandon s. allbery  [solaris,freebsd,perl,pugs,haskell]  allb...@kf8nh.com
system administrator  [openafs,heimdal,too many hats]  allb...@ece.cmu.edu
electrical and computer engineering, carnegie mellon university     KF8NH
[Haskell-cafe] Re: What *is* a DSL?
Robert Atkey wrote:
> A deep embedding of a parsing DSL (really a context-sensitive grammar
> DSL) would look something like the following. I think I saw something
> like this in the Agda2 code somewhere, but I stumbled across it when I
> was trying to work out what free applicative functors were.
[snip code explanation]

This is extremely cool. I tried to understand in my head how this all
works but it just didn't click. It all seemed like magic. Then I went
ahead and tried to write a printer for your example grammar and now
everything is much clearer. Although I had to fight the type checker
quite a bit. This is the generic part:

  class Print f where
    pr :: f a -> String

  instance Print nt => Print (Production nt) where
    pr = printProduction

  printProduction :: Print nt => Production nt a -> String
  printProduction (Stop _)                  = ""
  printProduction (Terminal t (Stop _))     = show t
  printProduction (Terminal t p)            = show t ++ " " ++ printProduction p
  printProduction (NonTerminal nt (Stop _)) = pr nt
  printProduction (NonTerminal nt p)        = pr nt ++ " " ++ printProduction p

  instance Print nt => Print (Rule nt) where
    pr (Rule ps) = printPs ps
      where
        printPs []     = ""
        printPs [p]    = printProduction p
        printPs (p:ps) = printProduction p ++ " | " ++ printPs ps

  data Any f = forall a. Any (f a)

  class Enumerable f where
    enumeration :: [Any f]

  printRule :: Print nt => (nt a -> Rule nt a) -> nt a -> String
  printRule g nt = pr nt ++ " ::= " ++ pr (g nt)

  printGrammar :: (Print nt, Enumerable nt) => Grammar nt -> String
  printGrammar g = foldr (++) "" (intersperse "\n" rules)
    where
      rules = map printAnyRule enumeration
      printAnyRule (Any nt) = printRule g nt

We must also provide instances for the concrete types:

  instance Enumerable NT where
    enumeration = [Any Sum, Any Product, Any Value]

  instance Print NT where
    pr Value   = "Value"
    pr Product = "Product"
    pr Sum     = "Sum"

So far so good. This even works... almost ;-)

  *Main> putStrLn $ printGrammar myGrm
  Sum ::= Product '+' Sum | Product
  Product ::= Value '*' Product | Value
  Value ::= Interrupted.        -- had to hit Ctrl-C here

When I replace 'posInt' with 'digit' in the rule for Value

  myGrm Value = ENum <$> digit
            <|> id <$ char '(' <*> nt Sum <* char ')'

then the printer terminates just fine:

  *Main> putStrLn $ printGrammar myGrm
  Sum ::= Product '+' Sum | Product
  Product ::= Value '*' Product | Value
  Value ::= '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9' | '(' Sum ')'

I found that the problem is the use of function 'some' from
Control.Applicative in

  posInt :: Rule nt Int
  posInt = fix 1 . reverse <$> some digit
    where
      fix n []     = 0
      fix n (d:ds) = d*n + fix (n*10) ds

Since 'some' is defined recursively, this creates an infinite
production for numbers that you can neither print nor otherwise
analyse in finite time.

I can see at least two solutions: One is to parameterize everything
over the type of terminals, too. A type suitable for the example would
be

  data T = TNum Int | TPlus | TMult | TOParen | TCParen

and leave token recognition to a separate scanner. The second solution
(which I followed) is to break the recursion by adding another
nonterminal to the NT type:

  data NT a where
    Sum     :: NT Expr
    Product :: NT Expr
    Value   :: NT Expr
    Number  :: NT [Int]
    Digit   :: NT Int

  instance Enumerable NT where
    enumeration = [Any Sum, Any Product, Any Value, Any Number, Any Digit]

  instance Print NT where
    pr Sum     = "Sum"
    pr Product = "Product"
    pr Value   = "Value"
    pr Number  = "Number"
    pr Digit   = "Digit"

(Adding Digit /and/ Number is not strictly necessary, but it makes for
a nicer presentation.)

  myGrm :: Grammar NT
  myGrm Sum     = ESum     <$> nt Product <* char '+' <*> nt Sum
              <|> id       <$> nt Product
  myGrm Product = EProduct <$> nt Value <* char '*' <*> nt Product
              <|> id       <$> nt Value
  myGrm Value   = ENum . toNat <$> nt Number
              <|> id <$ char '(' <*> nt Sum <* char ')'
  myGrm Number  = extend <$> nt Digit <*> optional (nt Number)
  myGrm Digit   = digit

  extend d Nothing   = [d]
  extend d (Just ds) = d:ds

  toNat :: [Int] -> Int
  toNat = fix 1 . reverse
    where
      fix n []     = 0
      fix n (d:ds) = d*n + fix (n*10) ds

With this I get

  *Main> putStrLn $ printGrammar myGrm
  Sum ::= Product '+' Sum | Product
  Product ::= Value '*' Product | Value
  Value ::= Number | '(' Sum ')'
  Number ::= Digit Number | Digit
  Digit ::= '0' | '1' | '2' | '3' | '4' | '5' | '6' | '7' | '8' | '9'

Morale: Be careful with recursive functions when constructing a data
representation (e.g. for a deep DSL). You might get an infinite
representation which isn't what you want in this case.

Oh, and another point: there should be a distinguished start
nonterminal, otherwise this is not really a grammar. This suggests
something like

  type Grammar nt a = (nt a, forall b. nt b -> Rule nt b)
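The two little digit-list helpers at the end can be checked in
isolation (transcribed standalone here, with types pinned down):

```haskell
-- Prepend a digit to an optional list of following digits.
extend :: Int -> Maybe [Int] -> [Int]
extend d Nothing   = [d]
extend d (Just ds) = d : ds

-- Interpret a most-significant-first digit list as a number, by
-- reversing and accumulating place values.
toNat :: [Int] -> Int
toNat = fix 1 . reverse
  where
    fix _ []     = 0
    fix n (d:ds) = d * n + fix (n * 10) ds
```

For example, `toNat (extend 2 (Just [5, 3]))` gives 253, matching the
grammar's reading of the digit string "253".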
[Haskell-cafe] Re: What *is* a DSL?
minh thu wrote:
> 2009/10/7 Günther Schmidt <gue.schm...@web.de>:
>>> I've informally argued that a true DSL -- separate from a good API
>>> -- should have semantic characteristics of a language: binding
>>> forms, control structures, abstraction, composition. Some have type
>>> systems.
>>
>> That is one requirement that confuses me, abstraction. I thought of
>> DSLs as special purpose languages, ie. you give your DSL everything
>> it needs for that purpose. Why would it also need the ability to
>> express even further abstractions, it is supposed to *be* the
>> abstraction.
>
> Programming abstractions at the DSL level, not to further abstract
> what the DSL covers. Functions, for instance, are typical abstraction
> means offered by programming languages. Even if your language is
> specific to some domain, being able to create your own functions, and
> not only rely on those provided by the DSL implementation, is
> important. Imagine a (E)DSL for 3D programming (e.g. a shading
> language): the language is designed to fit the problem well (e.g. in
> this case, 3D linear algebra, color operations, ...) but you'll agree
> it would be a shame to not be able to provide your own functions.

But isn't one of the advantages of an _E_DSL that we can use the host
language (Haskell) as a meta or macro language for the DSL? I would
think that this greatly reduces the need to provide abstraction
facilities /inside/ the DSL. In fact most existing (and often cited
examples of) EDSLs in Haskell do not provide abstraction.

Cheers
Ben
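The host-as-macro-language point can be made concrete with a minimal
sketch (all names hypothetical): the EDSL below has no functions of its
own, yet ordinary Haskell definitions abstract over its terms.

```haskell
-- A tiny EDSL with no abstraction facilities of its own: just data.
data Expr = Lit Int | Add Expr Expr deriving Show

eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b

-- Host-level "macros": plain Haskell functions that merely build EDSL
-- terms.  The abstraction lives in Haskell, not in the DSL.
sumOf :: [Expr] -> Expr
sumOf = foldr Add (Lit 0)

double :: Expr -> Expr
double e = Add e e
```

A user writes `sumOf (map Lit [1, 2, 3])` and gets an Expr tree; the
DSL itself never needed a notion of function.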
Re: [Haskell-cafe] Re: What *is* a DSL?
On Wed, Oct 7, 2009 at 2:52 PM, Ben Franksen wrote:
> But isn't one of the advantages of an _E_DSL that we can use the host
> language (Haskell) as a meta or macro language for the DSL?

Substantially so. I've used brief examples where the EDSL syntax is
basically the data declaration (perhaps with some operators overloading
constructors) to demonstrate Haskell's fitness as a host language for
EDSLs. This is also a credit to the expressiveness of Haskell's data
declarations.

/jve
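One common shape of "operators overloading constructors" is a Num
instance for the syntax type, so that ordinary numeric notation builds
the EDSL's syntax tree (a sketch with names of my choosing):

```haskell
-- The EDSL's entire syntax is one data declaration...
data Expr = Lit Integer | Add Expr Expr | Mul Expr Expr deriving Show

-- ...and a Num instance turns host-language operators into its concrete
-- syntax: the literal 2 * (3 + 4) is then an Expr, not an Integer.
instance Num Expr where
  fromInteger = Lit
  (+)         = Add
  (*)         = Mul
  negate e    = Mul (Lit (-1)) e
  abs         = error "abs: not part of this EDSL"
  signum      = error "signum: not part of this EDSL"

eval :: Expr -> Integer
eval (Lit n)   = n
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b
```

Here `2 * (3 + 4) :: Expr` is the tree `Mul (Lit 2) (Add (Lit 3)
(Lit 4))`, which `eval` (or a printer, optimizer, ...) can then
interpret.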
Re: [Haskell-cafe] Re: What *is* a DSL?
2009/10/7 Ben Franksen <ben.frank...@online.de>:
> minh thu wrote:
>> 2009/10/7 Günther Schmidt <gue.schm...@web.de>:
>>>> I've informally argued that a true DSL -- separate from a good API
>>>> -- should have semantic characteristics of a language: binding
>>>> forms, control structures, abstraction, composition. Some have
>>>> type systems.
>>>
>>> That is one requirement that confuses me, abstraction. I thought of
>>> DSLs as special purpose languages, ie. you give your DSL everything
>>> it needs for that purpose. Why would it also need the ability to
>>> express even further abstractions, it is supposed to *be* the
>>> abstraction.
>>
>> Programming abstractions at the DSL level, not to further abstract
>> what the DSL covers. Functions, for instance, are typical
>> abstraction means offered by programming languages. Even if your
>> language is specific to some domain, being able to create your own
>> functions, and not only rely on those provided by the DSL
>> implementation, is important. Imagine a (E)DSL for 3D programming
>> (e.g. a shading language): the language is designed to fit the
>> problem well (e.g. in this case, 3D linear algebra, color
>> operations, ...) but you'll agree it would be a shame to not be able
>> to provide your own functions.
>
> But isn't one of the advantages of an _E_DSL that we can use the host
> language (Haskell) as a meta or macro language for the DSL?

It is.

> I would think that this greatly reduces the need to provide
> abstraction facilities /inside/ the DSL. In fact most existing (and
> often cited examples of) EDSLs in Haskell do not provide abstraction.

Even when you have good macro support, you don't code everything at
the macro level. But it all depends on the particular EDSL we talk
about. If the EDSL is close to a regular programming language, it is
likely to provide the ability to create functions.

Cheers,
Thu
Re: [Haskell-cafe] Re: What *is* a DSL?
Ben Franksen wrote:
> minh thu wrote:
>> 2009/10/7 Günther Schmidt <gue.schm...@web.de>:
>>>> I've informally argued that a true DSL -- separate from a good API
>>>> -- should have semantic characteristics of a language: binding
>>>> forms, control structures, abstraction, composition. Some have
>>>> type systems.
>>>
>>> That is one requirement that confuses me, abstraction. I thought of
>>> DSLs as special purpose languages, ie. you give your DSL everything
>>> it needs for that purpose. Why would it also need the ability to
>>> express even further abstractions, it is supposed to *be* the
>>> abstraction.
>>
>> Programming abstractions at the DSL level, not to further abstract
>> what the DSL covers. Functions, for instance, are typical
>> abstraction means offered by programming languages. Even if your
>> language is specific to some domain, being able to create your own
>> functions, and not only rely on those provided by the DSL
>> implementation, is important. Imagine a (E)DSL for 3D programming
>> (e.g. a shading language): the language is designed to fit the
>> problem well (e.g. in this case, 3D linear algebra, color
>> operations, ...) but you'll agree it would be a shame to not be able
>> to provide your own functions.
>
> But isn't one of the advantages of an _E_DSL that we can use the host
> language (Haskell) as a meta or macro language for the DSL? I would
> think that this greatly reduces the need to provide abstraction
> facilities /inside/ the DSL. In fact most existing (and often cited
> examples of) EDSLs in Haskell do not provide abstraction.

I would say that the DSL is what the user sees. In this view, I think
it's correct to say that many (or most) DSLs need function abstraction.
Whether or not the internal data structure has function abstraction is
an implementation detail.

/ Emil
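One way to reconcile the two views is higher-order abstract syntax: the
user-visible DSL has lambda and application, while internally the
binder is represented by an ordinary host-language function. A sketch
(names of my choosing):

```haskell
-- User-visible DSL terms: the DSL *has* function abstraction (Lam,
-- App); internally a Lam just wraps a Haskell function (HOAS), which
-- is exactly an implementation detail the user never sees.
data Term
  = Lit Int
  | Add Term Term
  | Lam (Term -> Term)
  | App Term Term

-- Evaluate to a value term (a Lit or a Lam).
eval :: Term -> Term
eval (Add a b) =
  case (eval a, eval b) of
    (Lit m, Lit n) -> Lit (m + n)
    _              -> error "Add applied to non-numbers"
eval (App f a) =
  case eval f of
    Lam g -> eval (g a)
    _     -> error "application of a non-function"
eval t = t
```

So `App (Lam (\x -> Add x x)) (Lit 21)` is a DSL-level function applied
to a DSL-level argument, even though no first-order representation of
binders exists in the data structure.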