Hi Torsten,

Did you have a look at WordDelimiterFilter?

Sounds like it fits your needs.
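
Roughly like this (an untested sketch against the Lucene 3.6 API; the
analyzer class name here is made up). Note that StandardTokenizer
already splits "Lucene-Core" at the hyphen before any filter runs, so
the usual trick is a WhitespaceTokenizer followed by
WordDelimiterFilter with PRESERVE_ORIGINAL:

import java.io.Reader;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.WhitespaceTokenizer;
import org.apache.lucene.analysis.miscellaneous.WordDelimiterFilter;
import org.apache.lucene.util.Version;

// Hypothetical analyzer: keeps "Lucene-Core" as one token while still
// indexing the parts "Lucene" and "Core".
public class HyphenAwareAnalyzer extends Analyzer {
    @Override
    public TokenStream tokenStream(String fieldName, Reader reader) {
        // WhitespaceTokenizer leaves "Lucene-Core" intact;
        // StandardTokenizer would have split it at the hyphen already.
        TokenStream ts = new WhitespaceTokenizer(Version.LUCENE_36, reader);
        // GENERATE_WORD_PARTS emits "Lucene" and "Core";
        // PRESERVE_ORIGINAL additionally emits the original
        // "Lucene-Core" at the same position.
        int flags = WordDelimiterFilter.GENERATE_WORD_PARTS
                  | WordDelimiterFilter.PRESERVE_ORIGINAL;
        // null = no protected-words set
        return new WordDelimiterFilter(ts, flags, null);
    }
}

With both flags set, a query for "Lucene-Core" and a query for "core"
should both match; if you only want the combined token, drop
GENERATE_WORD_PARTS.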

Regards,
Em

On 17.02.2012 15:14, Torsten Krah wrote:
> Hi,
> 
> Is it possible to extend the standard tokenizer, or to use a custom
> one (possibly by extending the standard one), so that "custom" tokens
> like Lucene-Core are kept as one token?
> 
> regards
