See below:

On Sun, Sep 25, 2011 at 9:53 AM, Jithin <jithin1...@gmail.com> wrote:
> Hi Erick, The problem I am trying to solve is to filter out invalid entities.
> Users might misspell or enter a new entity name. These new/invalid entities
> need to pass through a KeepWordFilter so that they won't pollute our
> autocomplete results.
>

Right. But if you have a KeepWordFilter, that implies that you have a list
of known good words. Couldn't you use that file as your base for the
autosuggest component?
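
For instance, with the Suggester available since Solr 3.1 you can point
the component directly at a word file, something along these lines in
solrconfig.xml (an untested sketch -- "keepwords.txt" stands in for the
same file you feed KeepWordFilterFactory, and parameter details vary by
Solr version):

<searchComponent name="suggest" class="solr.SpellCheckComponent">
  <lst name="spellchecker">
    <str name="name">suggest</str>
    <str name="classname">org.apache.solr.spelling.suggest.Suggester</str>
    <str name="lookupImpl">org.apache.solr.spelling.suggest.tst.TSTLookup</str>
    <str name="sourceLocation">keepwords.txt</str>
  </lst>
</searchComponent>

That way only your known-good entities can ever be suggested, and you
maintain a single file for both filtering and suggesting.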

> I was looking into Luke. And it does seem to solve my use case, but is Luke
> something I can use in a production setup?

You'd find the performance unacceptably slow if you tried to do something
similar in production. The nature of an inverted index makes reconstructing
a document from its individual terms costly.


> Also when does <copyField> happens? Is the data being copied a result of
> application of all filters or unmodified one?

<copyField> operates on the raw input, not on the result of your
analysis chain. And you can't chain <copyField> directives;
i.e.
<copyField source="field1" dest="field2" />
<copyField source="field2" dest="field3" />

would not put the contents of "field1" into "field3".
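
If you do want the same raw content in both destination fields, copy
from the original source field each time:

<copyField source="field1" dest="field2" />
<copyField source="field1" dest="field3" />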

Best
Erick

>
> --
> View this message in context: 
> http://lucene.472066.n3.nabble.com/How-to-apply-filters-to-stored-data-tp3366230p3366987.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
