> Hi Nicolas,
> 
> Would you mind if I enquired as to which Parser / Scanner generator you
> used for Neko?  I've taken a look at Cocol/r and AntLR and both look
> good, but I'd like the advice of someone experienced.
> 
> Thanks,
> Lee

NekoML has its own lexer and parser.

- the Neko lexer (see src/neko/Lexer.nml) is a list of pairs (regexp
string, function). The regexps are compiled and executed using pure
NekoML code (see src/core/Lexer.nml and src/core/LexEngine.nml).
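For illustration only, the list of (regexp string, action) pairs could
look roughly like this, sketched in plain OCaml rather than NekoML; the
token type and the rules themselves are invented, not the actual Neko
lexer definitions:

  (* hypothetical rule table: each regexp string is paired with the
     action that turns the matched text into a token *)
  type token = Int of int | Ident of string | Plus | Skip

  let rules : (string * (string -> token)) list = [
    "[0-9]+",      (fun s -> Int (int_of_string s));
    "[a-zA-Z_]+",  (fun s -> Ident s);
    "\\+",         (fun _ -> Plus);
    "[ \t\r\n]+",  (fun _ -> Skip);    (* whitespace, discarded later *)
  ]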

Basically, the lexer engine parses the regexp strings at runtime,
translates them into an NFA, then transforms the NFA into a DFA and
expands the char tables for fast lookup. Performance is decent (startup
is quite low), but of course it doesn't compare to generated and
compiled C code.
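To give a rough idea of the expanded char tables (again an OCaml sketch
with an invented layout, not the actual LexEngine.nml data structures):
each DFA state owns a 256-entry row, so matching costs one array access
per input byte.

  type dfa = {
    trans : int array array;   (* next state per byte, -1 = stuck *)
    final : bool array;        (* final.(state) = accepting state? *)
  }

  (* length of the longest match of [d] in [s] starting at [pos], if any *)
  let longest_match (d : dfa) (s : string) (pos : int) : int option =
    let rec go state i best =
      let best = if d.final.(state) then Some (i - pos) else best in
      if i >= String.length s then best
      else
        let next = d.trans.(state).(Char.code s.[i]) in
        if next < 0 then best else go next (i + 1) best
    in
    go 0 pos None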

- there are also built-in stream parsers in NekoML that can be used to
implement LL(1) parsers (see src/neko/Parser.nml). Basically, you match
a stream of tokens coming from a lexer and execute the corresponding
rules to build the AST.
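A minimal OCaml sketch of that style (plain token lists stand in for
NekoML streams, and the toy grammar is invented): look at the next
token, pick the rule, and build the AST node.

  (* toy grammar:  expr ::= atom ('+' expr)?
                   atom ::= INT | '(' expr ')'  *)
  type token = TInt of int | TPlus | TLParen | TRParen
  type expr  = Int of int | Add of expr * expr

  let rec parse_atom = function
    | TInt n :: rest -> Int n, rest
    | TLParen :: rest ->
        (match parse_expr rest with
         | e, TRParen :: rest -> e, rest
         | _ -> failwith "expected ')'")
    | _ -> failwith "expected an integer or '('"

  and parse_expr toks =
    match parse_atom toks with
    | lhs, TPlus :: rest ->
        let rhs, rest = parse_expr rest in
        Add (lhs, rhs), rest
    | lhs, rest -> lhs, rest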

Stream parsers are quite powerful since they can use ML pattern
matching. The NekoML implementation is actually LL(k) since you can
match several tokens in a given rule as long as you don't enter a sub-rule.
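For instance (still an invented OCaml sketch, not Parser.nml itself),
matching several tokens in one rule lets you tell an assignment apart
from a bare identifier without introducing a sub-rule:

  type token = TIdent of string | TEq | TInt of int
  type stmt  = Assign of string * int | Ref of string

  let parse_stmt = function
    (* this arm inspects three tokens at once *)
    | TIdent x :: TEq :: TInt n :: rest -> Assign (x, n), rest
    | TIdent x :: rest -> Ref x, rest
    | _ -> failwith "unexpected token"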

Neither implementation is the most optimized, in particular because
they are pure Neko/NekoML. OTOH they don't need any additional C
library and the resulting code is quite small.

Nicolas

-- 
Neko : One VM to run them all
(http://nekovm.org)
