Hello,

I'm developing a project where I'm using Elasticsearch to index files (PDF,
DOC, TXT...). I have to index the content of these files, and they're written
in different languages.

My concern is about the stopwords filter. I've used it before, but only with a
single-language index, so I didn't have any problems. Now I'm going to have
from 7 to 12 languages (English, French, Spanish, Galician, German...). I did
some research and didn't find any relevant information.

My questions are:

- Is there any kind of stopwords filter for multi-language use?
- Should I use several stopwords filters? (Doesn't seem optimal to me.)
- Should I have one index per language, so each index has a different
mapping?
- Or maybe combine the stopwords of all the languages I need into a single
filter? (See the sketch below for the per-language-field idea.)
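For context, this is roughly what I was imagining for the "one field per
language" route: a single index where each language gets its own field,
analyzer, and stop filter. This is just a rough sketch with the Python client;
the index name, field names (content_en, content_es, content_fr), and the
reduced set of languages are placeholders, not a working setup.

    from elasticsearch import Elasticsearch

    es = Elasticsearch()  # assumes a node running on localhost:9200

    # One index, one field per language, each field with its own stop filter.
    body = {
        "settings": {
            "analysis": {
                "filter": {
                    "english_stop": {"type": "stop", "stopwords": "_english_"},
                    "spanish_stop": {"type": "stop", "stopwords": "_spanish_"},
                    "french_stop":  {"type": "stop", "stopwords": "_french_"},
                },
                "analyzer": {
                    "english_text": {"type": "custom", "tokenizer": "standard",
                                     "filter": ["lowercase", "english_stop"]},
                    "spanish_text": {"type": "custom", "tokenizer": "standard",
                                     "filter": ["lowercase", "spanish_stop"]},
                    "french_text":  {"type": "custom", "tokenizer": "standard",
                                     "filter": ["lowercase", "french_stop"]},
                },
            }
        },
        "mappings": {
            "document": {
                "properties": {
                    "content_en": {"type": "string", "analyzer": "english_text"},
                    "content_es": {"type": "string", "analyzer": "spanish_text"},
                    "content_fr": {"type": "string", "analyzer": "french_text"},
                }
            }
        },
    }

    es.indices.create(index="documents", body=body)

The downside I see is that the application has to detect the language of each
file up front to pick the right field, which is why I'm asking whether there's
a better-established approach.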

I hope someone has faced this issue before and can point me to a successful
solution.

Thanks in advance



-----
I know that I know nothing.