: My real use case is adding the trim filter to the pattern tokenizer.
: The 'correct' answer in my case is to update the offsets.

hmmm... wouldn't the "correct" thing to do in that case be to change your
pattern so it strips the whitespace when tokenizing?  That way the offsets
of your tokens will be accurate from the beginning.
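
To illustrate the point, here is a small Python sketch (a regex stands in
for the tokenizer's pattern; the strings and patterns are made up for the
example): if the pattern captures surrounding whitespace and a later trim
filter strips it, the recorded start offsets no longer line up with the
token text, whereas a pattern that excludes whitespace yields correct
offsets from the start.

```python
import re

text = "foo, bar, baz"

# Pattern that captures everything between commas: tokens keep their
# leading space, so a later trim would leave stale start offsets.
naive = [(m.group(), m.start()) for m in re.finditer(r"[^,]+", text)]
# -> [('foo', 0), (' bar', 4), (' baz', 9)]

# Pattern that excludes whitespace while tokenizing: the offsets point
# at the actual token characters, so no trim (or offset fixup) is needed.
trimmed = [(m.group(), m.start()) for m in re.finditer(r"[^\s,]+", text)]
# -> [('foo', 0), ('bar', 5), ('baz', 10)]
```

Trimming ' bar' to 'bar' after the fact would leave its start offset at 4,
one character off; tokenizing with the second pattern avoids the problem.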



-Hoss
