On Mar 10, 8:53 pm, robert.mull...@gmail.com wrote:
> I understand the method, but when you say you "count one DEDENT for
> each level", well, let's say you counted 3 of them. Do you have a way
> to interject 3 consecutive DEDENT tokens into the token stream so that
> the parser receives them before it receives the
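One common answer to exactly this question (a sketch of the general technique, not the poster's actual ocamllex/ocamlyacc code) is to put the generated lexer behind a small wrapper that keeps a queue of pending tokens. When one input line closes several blocks at once, the wrapper pushes that many DEDENTs onto the queue, and the parser's one-token-at-a-time calls drain the queue before the wrapper reads any more input. A minimal Python sketch of the idea; the `IndentLexer` class and the token names (`INDENT`, `DEDENT`, `LINE`, `EOF`) are illustrative, not part of any real tool:

```python
# Sketch of the "pending token queue" trick for indentation-sensitive
# lexing. Assumes a simple line-based input; blank lines are ignored
# and malformed dedents are not diagnosed, to keep the idea visible.
class IndentLexer:
    def __init__(self, lines):
        self.raw = iter(lines)
        self.levels = [0]    # stack of currently open indentation widths
        self.pending = []    # tokens produced but not yet handed out

    def next_token(self):
        """Called once per token by the parser, yacc-style."""
        # Drain queued tokens first: three DEDENTs queued at once
        # come back on three consecutive calls.
        if self.pending:
            return self.pending.pop(0)
        for line in self.raw:
            if not line.strip():
                continue                       # skip blank lines
            width = len(line) - len(line.lstrip(" "))
            if width > self.levels[-1]:
                self.levels.append(width)
                self.pending.append(("INDENT", width))
            while width < self.levels[-1]:
                self.levels.pop()              # one DEDENT per closed level
                self.pending.append(("DEDENT", width))
            self.pending.append(("LINE", line.strip()))
            return self.pending.pop(0)
        while len(self.levels) > 1:
            self.levels.pop()                  # close blocks still open at EOF
            self.pending.append(("DEDENT", 0))
        self.pending.append(("EOF", 0))
        return self.pending.pop(0)
```

Pulling tokens from `IndentLexer(["a:", "  b:", "    c:", "      d", "e"])` yields three consecutive `DEDENT` tokens when the line `e` closes three nested blocks, which is the scenario asked about. The same shape carries over to ocamlyacc: the function you hand the parser checks a mutable pending list before calling the ocamllex-generated rule.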
On Mar 10, 9:38 pm, Paul McGuire wrote:
> On Mar 10, 8:31 pm, robert.mull...@gmail.com wrote:
>
> > I am trying to implement a lexer and parser for a subset of python
> > using lexer and parser generators. (It doesn't matter, but I happen to
> > be using
> > ocamllex and ocamlyacc). I've run i
I am trying to implement a lexer and parser for a subset of Python
using lexer and parser generators. (It doesn't matter, but I happen to
be using ocamllex and ocamlyacc.) I've run into the following annoying
problem and am hoping someone can tell me what I'm missing. Lexers
generated by such tools re