On Wed, 2006-09-06 at 17:19 -0400, Peter Tanski wrote:
> O.K., now. Sorry for the late reply. Thanks for answering my pseudo-
> rants; I apologise.
Lol .. i'm the rant master :)
> I would help with the documentation problems but
> since I have little idea of what is available in this language I am
> rather stuck.
Yup, the tutorial and regression tests are the best source.
> On the matter of using a C++ list<>, I meant implement your own
> container (I could at least help with that).
We could, but why? The Felix one should generate good code.
If it doesn't, we need to fix Felix so it does. It should
be faster than both Ocaml and Haskell (once the GC overhead
is removed) because Felix products are not boxed.
> The gcc-STL does allow
> you to override the allocator but you would still run into the
> problem of storing higher-level values (such as a list of lambda
> expressions) in it.
Yup. Overriding the allocator doesn't help: we need shapes
for all the structs, which tell where the pointers are, and
that's hard to know for foreign code.
> The stream solution looks nice.
>
> As for syntax, I would fairly say that less than half of my friends
> who know and work with C++ on a daily basis actually like C++ syntax,
> despite its other advantages; the rest continue to dabble in
> languages they *wish* they could use at work :)
Indeed. There's a tradeoff. Being able to point at Felix code
and say "See, it's really an upgrade of C++" is part of the
design goal: bosses need convincing too, and they only see
concrete syntax. So I use
#include <jhg>
#define
etc for the preprocessor directives .. even though the
semantics are quite different, eg: #define'd symbols
can ONLY be used in #if style conditional compilation:
the only # macros that affect source code are
user defined syntax things like #statement.
> It isn't out of the
> question for me to write a lexer in Felix's own GLR--assuming I
> actually start to understand what Felix is about so I might inject
> some consistency between functional syntax and the functionality.
You could .. there's also a reglex facility and a ready made
lexer in the library. There's ALSO a GLR parser but it's been
abandoned because it's too hard to keep in sync with the
Ocamlyacc parser. One day we'll bootstrap.
> Now back to fun stuff: I tried (again) to figure out the problem of
> using the new typecase ... endcase construction to produce generic
> functions over types. First, I ran into another parsing problem I
> don't know the answer to:
>
>
> fun gmapT_one[A,B,T] (f:A->B, x:T): T =
> {
> return ({typecase[c,a] c a => c ({f a;}) endcase x;});
> }
>
> returns this error:
>
> CLIENT ERROR
> [rex] Unexpected typecase [c,a] (c a) => (c (fun () = {
> call noexpand(f) a;
> }))endcase
>
> --I tried a bunch of different variations so if this seems messy it
> is the closest parsable construction I could come up with.
Yeah, you miss a 'minor' detail here: pattern calculus is only
implemented for *type expressions*. That's why my examples
look like
var x : typecase .. endcase (int + long) = 1;
.. the typecase is a *type* case, applied to types.
The reason: it's experimental, and the implementation for type
expressions is done entirely at compile time (since there is
no dependent typing).
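To make the compile-time flavour concrete, here's a hedged Python sketch of the idea (not Felix internals; the names Prim, Sum and typecase_sum_swap are invented for illustration). A typecase over types is just a structural rewrite of type terms, so it can run entirely before any code executes:

```python
# Hypothetical sketch: type expressions as a tiny term language,
# with a typecase modelled as a compile-time rewrite.
from dataclasses import dataclass

@dataclass(frozen=True)
class Prim:            # a primitive type like int or long
    name: str

@dataclass(frozen=True)
class Sum:             # a sum type, e.g. int + long
    left: object
    right: object

def typecase_sum_swap(t):
    """Model of `typecase[a,b] a + b => b + a endcase` applied
    to a type: purely structural, no run-time values involved."""
    if isinstance(t, Sum):
        return Sum(t.right, t.left)   # bind a, b and rebuild
    return t                          # no matching case: unchanged

result = typecase_sum_swap(Sum(Prim("int"), Prim("long")))
print(result)   # Sum(left=Prim(name='long'), right=Prim(name='int'))
```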
If it works, we might think about doing a version for
executable expressions and patterns. However dynamically
generated matches will require at least a complete rewrite
of the pattern matching, because at present the match
facility is eliminated entirely during syntactic desugaring.
Constant patterns only support two match combinators:
variants and tuples, which are easy to desugar.
To handle pattern calculus for expressions we'd need
a bound pattern construction; all we actually have
is combinators for tuple projections, and for comparing
variant tags and extracting variant arguments (which
together are enough).
When you think about, for example, path-polymorphic searching,
it would seem some of the pattern calculus operations would
have to be done at run time .. which means a run-time
representation of patterns .. this sounds hard compared to
calculations with Ocaml combinators at compile time .. :)
I could be wrong though.
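For illustration, here's a hedged Python sketch (names invented, not Felix's actual desugaring) of the two combinator families mentioned above, and how a match desugars into them:

```python
# Hypothetical sketch: a match over a variant desugars into a tag
# comparison plus an argument extraction; tuples use projections.

def variant(tag, arg=None):        # runtime representation of a variant
    return (tag, arg)

def has_tag(v, tag):               # variant tag comparison
    return v[0] == tag

def variant_arg(v):                # variant argument extraction
    return v[1]

def proj(i):                       # tuple projection combinator
    return lambda t: t[i]

# Desugaring of:  match x with | Some (a, b) => a + b | None => 0
def matcher(x):
    if has_tag(x, "Some"):
        pair = variant_arg(x)
        return proj(0)(pair) + proj(1)(pair)
    return 0

print(matcher(variant("Some", (3, 4))))   # 7
print(matcher(variant("None")))           # 0
```

The point is that the combinators are fixed at desugaring time; a run-time pattern representation would need these to be built and composed dynamically.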
> (2) without type classes (or some other construction, such as a
> class), you need to wrap typecase in a function in order to handle
> the overload resolution for recursively applying typecase
> deconstructors over a type. This may be a subtle point.
Well typecase only works on types at the moment. Type functions
can also be overloaded on argument kind, but I glossed over that
and just required TYPE. That is:
typecase[c,a] c a => a
would never work, because the notation is short for
typecase[c:TYPE, a:TYPE] c a => a endcase
Luckily, this isn't a problem, because there are no
variants in the type system, so no instances of 'c'
above can be constructed. A type function, eg:
typedef fun pair(a:TYPE):TYPE = a * a;
is a function, NOT a type constructor, that is, in
the type expression:
pair t
pair is actually a free variable bound to the definition
of 'pair' above, and 'pair' is actually replaced during
lookup by a lambda term. By the time the pattern
calculator sees this expression it looks like:
apply ( (lambda t'. t' * t'), t )
that is, no constructors in sight. The only constructors
are actually 'int', 'long' etc. Well technically in:
type vector[t] = "std::vector<?1>";
'vector' is a type constructor with an argument, but
it isn't applied: vector[int] is actually a single name.
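A hedged Python sketch of why no constructor application survives (again, invented names, nothing here is the real Felix reducer): lookup replaces 'pair' by a lambda, and beta-reduction leaves only products and primitives for the pattern calculator to see:

```python
# Hypothetical sketch: `pair t` after lookup is an application of a
# lambda to t; beta-reduction eliminates the application entirely.

def Lam(var, body):   return ("lam", var, body)
def Apply(fn, arg):   return ("apply", fn, arg)
def Product(a, b):    return ("product", a, b)
def Var(name):        return ("var", name)

def subst(term, var, value):
    kind = term[0]
    if kind == "var":
        return value if term[1] == var else term
    if kind == "product":
        return Product(subst(term[1], var, value), subst(term[2], var, value))
    if kind == "apply":
        return Apply(subst(term[1], var, value), subst(term[2], var, value))
    return term                        # primitives etc. are untouched

def reduce(term):
    if term[0] == "apply" and term[1][0] == "lam":
        _, var, body = term[1]
        return reduce(subst(body, var, term[2]))
    return term

# typedef fun pair(a:TYPE):TYPE = a * a;  then the type `pair int`:
pair = Lam("a", Product(Var("a"), Var("a")))
print(reduce(Apply(pair, ("prim", "int"))))
# ('product', ('prim', 'int'), ('prim', 'int'))
```

After reduction there is no `apply` node left, so a pattern like `c a` has nothing to match against.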
To get real constructors into the type system we'd need:
typeunion X = Left of TYPE | Right of TYPE;
for example. This is a union of categories, not types,
just as
TYPE * TYPE
is a product of categories.
The bottom line is that pattern calculus for expressions
would have to be augmented to:
exprcase [a:int, b: long] Some (a,b) => a, b endcase
for example (where of course 'int' and 'long' might be
type variables). In other words, it would have to be a
typed pattern calculus.
In your example you use map. This will NOT work with
overloading. Felix does not permit overloads based on
instantiation of type variables: there's no dependent
name lookup.
fun f[t](x:t)=> add(x,x);
will never work, not even with
f[int](2)
.. unless of course you have a polymorphic add routine:
fun add[t](x:t)=> ...
In other words if you want to apply a map polymorphic in
the data functor (eg list, array) it HAS to be polyadic.
The whole point of the pattern calculus is that it
actually lets you define a polyadic map, fold etc:
Haskell cannot do this.
In fact .. buried in Felix sources is some code which
attempts to make 'map' a compiler primitive, which
is fully polyadic: given a type like
list = 1 + t * list
map is generated by the compiler .. however it's a hack,
because it relies on the type variable 't' to identify
the 'hole' in the data structure to be mapped.
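A hedged Python sketch of that idea (an illustration of generating map from a polynomial functor description, not the actual Felix code; the shape grammar is invented): given `list = 1 + t * list`, map is derived by walking the shape, with the type variable t marking the hole:

```python
# Hypothetical sketch: generate map from the shape 1 + t * list.
# Shape grammar: ("unit",) no data, ("hole",) an element of type t,
# ("prod", a, b), ("sum", a, b), ("rec",) a recursive occurrence.

LIST_SHAPE = ("sum", ("unit",), ("prod", ("hole",), ("rec",)))

def gen_map(shape, f, value, rec):
    kind = shape[0]
    if kind == "unit":
        return value                      # no data: Empty stays Empty
    if kind == "hole":
        return f(value)                   # the element of type t
    if kind == "prod":
        a, b = value
        return (gen_map(shape[1], f, a, rec),
                gen_map(shape[2], f, b, rec))
    if kind == "sum":
        tag, payload = value
        branch = shape[1] if tag == 0 else shape[2]
        return (tag, gen_map(branch, f, payload, rec))
    if kind == "rec":
        return rec(f, value)              # recurse on the same functor

def list_map(f, xs):
    return gen_map(LIST_SHAPE, f, xs, list_map)

# (1, (head, tail)) is a cons cell, (0, None) is the empty list.
xs = (1, (1, (1, (2, (0, None)))))        # the list [1, 2]
print(list_map(lambda n: n * 10, xs))     # (1, (10, (1, (20, (0, None)))))
```

The same gen_map works for any shape in this grammar, which is the polyadic part; the hack the mail mentions is that only the type variable identifies which positions are holes.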
--
John Skaller <skaller at users dot sf dot net>
Felix, successor to C++: http://felix.sf.net
_______________________________________________
Felix-language mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/felix-language