>You assumed that \s will delimit the tokens. That's not the case (see
>the original message, the interesting data can occur anywhere). So you
>can't tokenize and do a simple hash lookup. If you benchmark 6000
Actually, I believe the OP said that there were still delimiters required;
they just weren't \s, so one CAN still tokenize and keep the hash lookup.
Quoth Kripa on Fri, 4 Feb 2011 19:53:35 -0500
>Unfortunately, my names can be embedded in larger "words" of the input
>text, as long as they are delimited by certain punctuation.
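
For example, a minimal sketch of that approach. The punctuation class
and the name list here are placeholders, since the OP hasn't told us the
actual delimiters or names:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical stand-in for the OP's ~6000 names.
    my @names = qw(alice bob carol);

    # Hash for constant-time membership tests.
    my %is_name = map { $_ => 1 } @names;

    while (my $line = <STDIN>) {
        chomp $line;
        # Split on a guessed punctuation class rather than \s;
        # substitute whatever delimiters the real data uses.
        for my $token (split /[\s,;:.!?()\/-]+/, $line) {
            print "matched: $token\n" if exists $is_name{$token};
        }
    }

The point of tokenizing is that split-plus-hash-lookup costs the same
per token no matter how many names are in the hash, instead of running
6000 separate pattern matches over the input.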