On 26/01/2015 16:37, Alessandro Bonfanti wrote:
On 21/01/2015 11:43, Alessandro Bonfanti wrote:
On 02/12/2014 09:21, Alessandro Bonfanti wrote:
On 12/11/2014 17:43, Alessandro Bonfanti wrote:
On 12/11/2014 17:20, Nikolas Everett wrote:
--
You received this message because you are subscribed to a
topic in the Google Groups "elasticsearch" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/elasticsearch/Y6I2qNZxR-s/unsubscribe.
To unsubscribe from this group and all its topics, send an
email to [email protected].
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/CAPmjWd0itbdHQ-maOuOmrrYf2QCqMFORTG21QpFHOCrp9E0rmg%40mail.gmail.com.
For more options, visit https://groups.google.com/d/optout.
This is my query (in Ruby):
@client.search index: @index, body: {query: {wildcard: {_all: query_text}}}
The variable names should be self-explanatory.
I read that wildcards are slower; if you have a cleaner
solution it would be very welcome (I still need to be able to
search for "linc-ZNF6092" in addition to "ZNF6092").
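For illustration, a minimal sketch of the body for a substring search — a wildcard query needs explicit `*` characters around the fragment; the value shown here is just an example:

```ruby
# Sketch: a wildcard query needs explicit '*' wildcards to match
# "ZNF6092" inside an untokenized term such as "linc-ZNF6092".
query_text = "*ZNF6092*"
body = { query: { wildcard: { _all: query_text } } }
# @client.search index: @index, body: body   # requires a running ES node
```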
I have made many attempts, but the problem persists.
Could it be caused by a setting other than the analyzer?
What I really need is a step-by-step method for disabling the
analyzer, or setting it to 'keyword', on every field of an index.
I have tried many approaches but none of them seems to work.
This situation is causing me a lot of problems: I need ES not to
tokenize my literal strings. Why isn't there a clear way to
switch that off?
Thanks everyone.
OK, after many attempts I can finally set the analyzer to
'keyword' by default. I do this with:
@es_client.indices.create index: "test", body: { "index" => { "analysis" => { "analyzer" => { "default" => { "type" => "keyword" }}}}}
Now I have solved some problems: I can finally do exact matching
with the 'term' query, for example on a path
'/home/data/foo.bar' or on a gene id 'ENSG00000186092'.
The bad news is that the problems with 'query_string' have gotten
even worse. It seems that query_string cannot work with
non-analyzed fields.
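For illustration, an exact-match term query body of the kind that now works with the 'keyword' default analyzer (the field name 'path' is hypothetical):

```ruby
# Sketch: with the default 'keyword' analyzer the whole string is one
# token, so a term query can match it exactly (field name hypothetical).
body = { "query" => { "term" => { "path" => "/home/data/foo.bar" } } }
# @es_client.search index: "test", body: body
```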
If I try a trivial query:
@es_client.search index: "test", body: {"query" => { "query_string" => { "query" => "ENSG00000186092" }}}
nothing works (0 results found). The text has no spaces or other
special characters that could cause problems with tokenization.
So what is the problem?
Could a solution be to use a 'fake' pattern tokenizer with
pattern "$^" (this should create a pattern that never matches,
with a result similar to the 'keyword' analyzer)?
Any other idea would be very much appreciated.
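A sketch of that idea, untested — the pattern tokenizer splits text on the regex, so a regex that can never match should leave the whole input as a single token (the 'no_split' and 'whole_string' names are hypothetical):

```ruby
# Sketch (untested): "$^" can never match, so the tokenizer never
# splits and the input stays one token, like the 'keyword' tokenizer.
settings = {
  "analysis" => {
    "tokenizer" => {
      "no_split" => { "type" => "pattern", "pattern" => "$^" }
    },
    "analyzer" => {
      "whole_string" => { "type" => "custom", "tokenizer" => "no_split" }
    }
  }
}
# @es_client.indices.create index: "test2",
#   body: { "settings" => settings }
```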
The search problems probably stem from the fact that query_string
automatically lowercases all words. This behavior is caused by
the 'lowercase' filter that gets inserted automatically.
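If that is the cause, one possible workaround (a sketch, untested) is a custom analyzer that keeps the whole string as a single token but also lowercases it, so the indexed term agrees with query_string's lowercased input (the 'lowercase_keyword' name is hypothetical):

```ruby
# Sketch (untested): the 'keyword' tokenizer emits the whole input as
# one token; the 'lowercase' filter then matches query_string's behavior.
settings = {
  "analysis" => {
    "analyzer" => {
      "lowercase_keyword" => {
        "type"      => "custom",
        "tokenizer" => "keyword",
        "filter"    => ["lowercase"]
      }
    }
  }
}
# @es_client.indices.create index: "test",
#   body: { "settings" => settings }
```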
I can't find any examples on the web about setting
analyzers/tokenizers/filters via the Ruby API. The only one that
seems to work well is the method shown above for setting the
default analyzer when a new index is created. Any suggestions?
I need a valid way to set them on custom fields, searches, etc.
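For reference, this is the shape I would expect for binding an analyzer to a specific field at index-creation time via the Ruby client (a sketch; 'my_type' and 'gene_id' are hypothetical names, and in the 1.x API the field mapping goes under 'mappings'):

```ruby
# Sketch: 'settings' holds the analysis config; 'mappings' binds an
# analyzer to a specific field ('my_type'/'gene_id' are hypothetical).
body = {
  "settings" => {
    "analysis" => {
      "analyzer" => { "default" => { "type" => "keyword" } }
    }
  },
  "mappings" => {
    "my_type" => {
      "properties" => {
        "gene_id" => { "type" => "string", "analyzer" => "keyword" }
      }
    }
  }
}
# @es_client.indices.create index: "test", body: body
```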