On Mon, 28 Jan 2013 01:53:02 +0100, "Brian Schott" <[email protected]> wrote:

> 
> The bottleneck in std.d.lexer as it stands is the appender 
> instances that assemble Token.value during iteration and front() 
> on the array of char[]. (As I'm sure everyone expected)

This sounds like a valid use case for buffered ranges (which we don't
have yet, AFAIK). Used correctly, they would let you avoid the appender
completely: slice the input buffer instead, and copy the slice only if
necessary. (If the complete file is kept in memory, even the copy can
be avoided.)
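To illustrate the idea, here is a minimal sketch of a slicing lexer. It is not std.d.lexer's actual API; `Token`, `lexIdentifiers`, and `isIdentChar` are hypothetical names, and it assumes the whole source is resident in memory so token values can stay zero-copy slices:

```d
import std.stdio;

// Hypothetical token type: `value` is a slice into the source
// buffer, so no per-token allocation or appending is needed.
struct Token
{
    string value;
}

// True for the characters this toy lexer treats as identifier-like.
bool isIdentChar(char c)
{
    return (c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z')
        || (c >= '0' && c <= '9') || c == '_';
}

// Scan `source` and return tokens whose values are slices of it,
// instead of copies assembled through an Appender.
Token[] lexIdentifiers(string source)
{
    Token[] tokens;
    size_t i = 0;
    while (i < source.length)
    {
        // Skip separators.
        while (i < source.length && !isIdentChar(source[i]))
            ++i;
        immutable start = i;
        while (i < source.length && isIdentChar(source[i]))
            ++i;
        if (i > start)
            tokens ~= Token(source[start .. i]); // slice, not copy
    }
    return tokens;
}

void main()
{
    foreach (t; lexIdentifiers("foo bar_baz 42"))
        writeln(t.value);
}
```

If the buffer is transient (e.g. refilled by a buffered range), the consumer would `.idup` only those slices it needs to keep past the next refill.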
