On 1/27/2013 12:15 PM, Philippe Sigaud wrote:
On Sun, Jan 27, 2013 at 8:48 PM, Walter Bright
<[email protected]> wrote:
On 1/27/2013 2:17 AM, Philippe Sigaud wrote:
Walter seems to think that if a lexer is not able to vomit thousands
of tokens a second, then it's not good.
Speed is critical for a lexer.
Something I never thought about: given a 10k-line module, how many
tokens does that represent, roughly? Ten times as many?
I don't know.
This means, for example, you'll need to squeeze pretty much all storage
allocation out of it. A lexer that does an allocation per token is not
going to do very well at all.
How does one do that? Honest question, I'm not really concerned by
extreme speed, most of the time, and have lots to learn in this area.
Just study the dmd lexer.c source code.