Howdy. I'm trying to parse numbers using lexer states, and there are no explicit ending delimiters. Instead, I was hoping to rely on the t_*_error functions to wrap up my token.
It turns out, though, that the lexer raises a LexError if the error handler returns without advancing lexpos. So I can't simply let t_error find the next non-matching character, wrap up my token, and switch the lexer back to another state. Is this a bug? If not, is there a generally accepted way to handle a state that ends not at a visible delimiter, but simply because it transitions somewhere else?

aTdHvAaNnKcSe,
=Austin

--
You received this message because you are subscribed to the Google Groups "ply-hack" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/ply-hack/65133dba-a541-49fb-9757-c39ee39903c9%40googlegroups.com.
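[Editor's note] For context, one commonly used way to end a PLY lexer state without a visible delimiter is to avoid t_*_error entirely: match the whole token in one rule inside the state and call lexer.begin() there, so the state exits the moment the token is consumed. Below is a minimal, hypothetical sketch, assuming a '#' character triggers the number state (the trigger and token names are illustrative, not from the original post):

```python
import ply.lex as lex

tokens = ('HASH', 'NUMBER', 'WORD')

# One exclusive state for scanning numbers; it has no closing delimiter.
states = (('num', 'exclusive'),)

def t_HASH(t):
    r'\#'
    t.lexer.begin('num')       # enter the number-scanning state
    return t

def t_num_NUMBER(t):
    r'\d+'
    t.value = int(t.value)
    t.lexer.begin('INITIAL')   # leave the state as soon as the number ends
    return t

def t_WORD(t):
    r'[a-zA-Z]+'
    return t

t_ignore = ' \t'
t_num_ignore = ''              # exclusive states don't inherit t_ignore

def t_error(t):
    t.lexer.skip(1)

def t_num_error(t):
    # PLY requires lexpos to advance here, so this can't be used to
    # "peek and back off" — hence the pattern above instead.
    t.lexer.skip(1)

lexer = lex.lex()
lexer.input('#42 foo')
print([(tok.type, tok.value) for tok in lexer])
# → [('HASH', '#'), ('NUMBER', 42), ('WORD', 'foo')]
```

The key point is that the rule consuming the token is also the one that transitions out of the state, so no end delimiter (and no error hook) is needed.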
