On Mon, Mar 15, 2010 at 7:25 PM, Chris Hostetter
<hossman_luc...@fucit.org> wrote:

> Hmmm... I'm not sure I understand how any declared CharFilter/Tokenizer
> combo will be able to deal with this any better, but I'll take your word
> for it.

You can see this behavior in SolrAnalyzer's reusableTokenStream method:
it re-uses the Tokenizer but wraps each incoming Reader with
charStream() [overridden by TokenizerChain to wrap the Reader with
your CharFilter chain].

  @Override
  public TokenStream reusableTokenStream(String fieldName, Reader reader)
      throws IOException {
    // if (true) return tokenStream(fieldName, reader);
    TokenStreamInfo tsi = (TokenStreamInfo)getPreviousTokenStream();
    if (tsi != null) {
      tsi.getTokenizer().reset(charStream(reader)); // <-- right here
      // ... rest of the method omitted
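
For anyone following along, here's a minimal, self-contained sketch of
that reuse pattern (not the actual Solr source; CachedTokenizer,
createTokenizer, and reusableTokenizer are hypothetical stand-ins for
the real Lucene/Solr types and methods): the analyzer hangs on to the
tokenizer, but every request's Reader goes through charStream() first,
which is where a TokenizerChain-style subclass applies its CharFilter
chain.

  import java.io.IOException;
  import java.io.Reader;

  /**
   * Sketch of the reuse pattern described above: the tokenizer object
   * is cached and reused across calls, but every new Reader is first
   * routed through charStream(), which a TokenizerChain-style subclass
   * would override to apply its CharFilter chain.
   */
  abstract class ReusableAnalyzerSketch {

    /** Hypothetical stand-in for a reusable Tokenizer. */
    interface CachedTokenizer {
      /** Points the tokenizer at new input without a new instance. */
      void reset(Reader input) throws IOException;
    }

    private CachedTokenizer previous; // per-thread in the real impl

    /** Hook applied to every Reader; base class does no filtering. */
    protected Reader charStream(Reader reader) {
      return reader;
    }

    /** Builds a fresh tokenizer on the very first call. */
    protected abstract CachedTokenizer createTokenizer(Reader input)
        throws IOException;

    /** Mirrors reusableTokenStream: reuse tokenizer, re-wrap Reader. */
    public CachedTokenizer reusableTokenizer(Reader reader)
        throws IOException {
      if (previous != null) {
        // The cached tokenizer survives, but the wrapped Reader is
        // rebuilt, so the CharFilter chain runs on every request.
        previous.reset(charStream(reader));
        return previous;
      }
      previous = createTokenizer(charStream(reader));
      return previous;
    }
  }

The point is that the CharFilter wrapping happens on every call, even
though the Tokenizer instance itself is reused.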


>
> Kill it then, and we'll just have to start making a list in the
> "Upgrading" section of CHANGES.txt noting the "recommended" upgrade path
> for this (and many, many things to come, I imagine).
>

Cool, I'll add some additional verbiage to the CHANGES in the branch.



-- 
Robert Muir
rcm...@gmail.com
