Re: More Robust Search Timeouts (to Kill Zombie Queries)?
I got responses, but no easy solution that would let me directly cancel a request. The responses did point to:

- the timeAllowed query parameter, which returns partial results: https://cwiki.apache.org/confluence/display/solr/Common+Query+Parameters#CommonQueryParameters-ThetimeAllowedParameter
- a possible hack that I never followed through on: http://mail-archives.apache.org/mod_mbox/lucene-solr-user/201401.mbox/%3CCANGii8eaSouePGxa7JfvOBhrnJUL++Ct4rQha2pxMefvaWhH=g...@mail.gmail.com%3E

Maybe one of those will help you? If they do, make sure to report back! -Luis

On Tue, Apr 1, 2014 at 3:13 AM, Salman Akram salman.ak...@northbaysolutions.net wrote: So you too never got any response...

On Mon, Mar 31, 2014 at 6:57 PM, Luis Lebolo luis.leb...@gmail.com wrote: Hi Salman, I was interested in something similar; take a look at the following thread: http://mail-archives.apache.org/mod_mbox/lucene-solr-user/201401.mbox/%3CCADSoL-i04aYrsOo2%3DGcaFqsQ3mViF%2Bhn24ArDtT%3D7kpALtVHzA%40mail.gmail.com%3E#archives I never followed through, however. -Luis

On Mon, Mar 31, 2014 at 6:24 AM, Salman Akram salman.ak...@northbaysolutions.net wrote: Anyone?

On Wed, Mar 26, 2014 at 7:55 PM, Salman Akram salman.ak...@northbaysolutions.net wrote: With reference to this thread http://mail-archives.apache.org/mod_mbox/lucene-solr-user/200903.mbox/%3c856ac15f0903272054q2dbdbd19kea3c5ba9e105b...@mail.gmail.com%3E I wanted to know if there was any response to that, or if Chris Harris himself can comment on what he ended up doing, that would be great!

-- Regards, Salman Akram
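The timeAllowed parameter mentioned above is passed as a plain request parameter; a minimal sketch (host, port, and collection name are placeholders):

```
# Cap query execution at 2000 ms; if the limit is hit, Solr returns whatever
# it has found so far and flags partialResults=true in the response header.
http://localhost:8983/solr/collection1/select?q=*:*&timeAllowed=2000
```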
Re: More Robust Search Timeouts (to Kill Zombie Queries)?
Hi Salman, I was interested in something similar; take a look at the following thread: http://mail-archives.apache.org/mod_mbox/lucene-solr-user/201401.mbox/%3CCADSoL-i04aYrsOo2%3DGcaFqsQ3mViF%2Bhn24ArDtT%3D7kpALtVHzA%40mail.gmail.com%3E#archives I never followed through, however. -Luis

On Mon, Mar 31, 2014 at 6:24 AM, Salman Akram salman.ak...@northbaysolutions.net wrote: Anyone?

On Wed, Mar 26, 2014 at 7:55 PM, Salman Akram salman.ak...@northbaysolutions.net wrote: With reference to this thread http://mail-archives.apache.org/mod_mbox/lucene-solr-user/200903.mbox/%3c856ac15f0903272054q2dbdbd19kea3c5ba9e105b...@mail.gmail.com%3E I wanted to know if there was any response to that, or if Chris Harris himself can comment on what he ended up doing, that would be great!

-- Regards, Salman Akram
Re: Problem querying large StrField?
Hi Yonik,

Thanks for the response. Our use case is perhaps a little unusual. The actual domain is bioinformatics, but I'll try to generalize.

We have two types of entities, call them A's and B's. For a given pair of entities (a_i, b_j) we may or may not have an associated data value z. Standard many-to-many stuff in a DB. Users can select an arbitrary set of entities from A. What we'd then like to ask of Solr is: which entities of type B have a data value for any of the A's I've selected?

The way we've approached this to date is to index the set of B such that each document has a multivalued field containing the IDs of all entities A that have a data value. If I select a set of A (a1, a2, a5, a9), then I would query data availability across B as dataAvailabilityField:(a1 OR a2 OR a5 OR a9). The sets of A and B are fairly large (~10-30k). This was working OK, but our datasets have grown and now the giant OR is getting too slow.

As an alternative approach, we developed a ValueParser plugin that took advantage of our ability to sort the list of entity IDs and do some clever things, like binary searches and short circuits on the results. For this to work, we concatenated all the IDs into a single comma-delimited value. So the data availability field is now single valued, but has a term that looks like a1,a3,a6,a7. Our function query then takes the list of A IDs that we're interested in and searches the documents for ones that match any value. It worked great and quite fast when the ID list was short enough. But then we tried it on the full data set, and the indexed terms of IDs are HUGE.

I know it's a bit of an odd use case, but have you seen anything like this before? Do you have any thoughts on how we might better accomplish this functionality? Thanks!
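A rough sketch of the sorted-ID matching idea described above, outside of Solr (class and method names here are illustrative, not from the actual plugin):

```java
import java.util.Arrays;

// Hypothetical sketch of the matching step: given a document's sorted id
// list (stored as one comma-delimited term) and the user's sorted selection,
// report whether any selected id has data. Binary search over the document's
// list avoids a linear scan, and we short-circuit on the first hit.
public class DataAvailability {
    static boolean matchesAny(String sortedCsv, String[] selection) {
        String[] docIds = sortedCsv.split(",");   // sorted at index time
        for (String wanted : selection) {
            if (Arrays.binarySearch(docIds, wanted) >= 0) {
                return true;                      // short circuit on first hit
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(matchesAny("a1,a3,a6,a7", new String[]{"a2", "a6"}));  // true
        System.out.println(matchesAny("a1,a3,a6,a7", new String[]{"a2", "a9"}));  // false
    }
}
```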
On Wed, Feb 5, 2014 at 1:42 PM, Yonik Seeley yo...@heliosearch.com wrote: On Wed, Feb 5, 2014 at 1:04 PM, Luis Lebolo luis.leb...@gmail.com wrote: Update: It seems I get the bad behavior (no documents returned) when the length of a value in the StrField is greater than or equal to 32,767 (2^15). Is this some type of bit overflow somewhere? I believe that's the maximum size of an indexed token. Can you share your use-case? Why are you trying to index such large values as a single token? -Yonik http://heliosearch.org - native off-heap filters and fieldcache for solr
Problem querying large StrField?
Hi All,

It seems that I can't query on a StrField with a large value (say 70k characters). I have a Solr document with a string type:

<fieldType name="string" class="solr.StrField" sortMissingLast="true"/>

and field:

<dynamicField name="someFieldName_*" type="string" indexed="true" stored="true"/>

Note that it's stored, in case that matters. Across my documents, the length of the value in this StrField can be up to ~70k characters or more.

The query I'm trying is 'someFieldName_1:*'. If someFieldName_1 has values with length ~10k characters, then it works fine and I retrieve various documents with values in that field. However, if I query 'someFieldName_2:*' and someFieldName_2 has values with length ~60k, I don't get back any documents, even though I *know* that many documents have a value in someFieldName_2. If I query *:* and add someFieldName_2 to the field list, I am able to see the (large) value in someFieldName_2.

So is there some type of limit to the length of strings in a StrField that I can query against?

Thanks, Luis
Re: Problem querying large StrField?
Update: It seems I get the bad behavior (no documents returned) when the length of a value in the StrField is greater than or equal to 32,767 (2^15). Is this some type of bit overflow somewhere?

On Wed, Feb 5, 2014 at 12:32 PM, Luis Lebolo luis.leb...@gmail.com wrote: Hi All, It seems that I can't query on a StrField with a large value (say 70k characters). I have a Solr document with a string type: <fieldType name="string" class="solr.StrField" sortMissingLast="true"/> and field: <dynamicField name="someFieldName_*" type="string" indexed="true" stored="true"/> Note that it's stored, in case that matters. Across my documents, the length of the value in this StrField can be up to ~70k characters or more. The query I'm trying is 'someFieldName_1:*'. If someFieldName_1 has values with length ~10k characters, then it works fine and I retrieve various documents with values in that field. However, if I query 'someFieldName_2:*' and someFieldName_2 has values with length ~60k, I don't get back any documents, even though I *know* that many documents have a value in someFieldName_2. If I query *:* and add someFieldName_2 to the field list, I am able to see the (large) value in someFieldName_2. So is there some type of limit to the length of strings in a StrField that I can query against? Thanks, Luis
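For what it's worth, the cutoff observed here matches Lucene's cap on a single indexed term, which (if memory serves) is 32766 bytes of UTF-8, hence failure at 32,767 single-byte characters. A hypothetical pre-index guard one might run on such values:

```java
import java.nio.charset.StandardCharsets;

// Hypothetical check: Lucene limits a single indexed term to 32766 bytes
// (UTF-8), so a StrField value at or above 32,767 ASCII characters will not
// be indexed as a matchable token. Class and method names are made up.
public class TermLengthCheck {
    static final int MAX_TERM_BYTES = 32766; // Lucene's per-term byte limit

    static boolean fitsInSingleToken(String value) {
        return value.getBytes(StandardCharsets.UTF_8).length <= MAX_TERM_BYTES;
    }

    public static void main(String[] args) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 32767; i++) sb.append('x');
        System.out.println(fitsInSingleToken("short value"));  // true
        System.out.println(fitsInSingleToken(sb.toString()));  // false
    }
}
```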
Cancel Solr query?
Hi All, Is it possible to cancel a Solr query/request currently in progress? Suppose the user starts searching for something (that takes a long time for Solr to process), then decides to modify the query. I can simply ignore the previous request and create a new request, but Solr is still processing the old request, correct? Is there any way to cancel that first request? Thanks, Luis
Re: Facet field query on subset of documents
Hi Erick,

Thanks for the reply and sorry, my fault, I wasn't clear enough. I was wondering if there was a way to remove terms that would always be zero (because the term came from a document that didn't match the filter query). Here's an example. I have a bunch of documents with fields 'manufacturer' and 'location'. If I set my filter query to manufacturer = Sony and all Sony documents had a value of 'Florida' for location, then I want 'Florida' NOT to show up in my facet field results. Instead, it shows up with a count of zero (and it'll always be zero because of my filter query). Using mincount = 1 doesn't solve my problem because I don't want it to hide zeroes that came from documents that actually pass my filter query. Does that make more sense?

On Thu, Nov 21, 2013 at 4:36 PM, Erick Erickson erickerick...@gmail.com wrote: That's what faceting does. The facets are only tabulated for documents that satisfy the query, including all of the filter queries and any other criteria. Otherwise, facet counts would be the same no matter what the query was. Or I'm completely misunderstanding your question... Best, Erick

On Thu, Nov 21, 2013 at 4:22 PM, Luis Lebolo luis.leb...@gmail.com wrote: Hi All, Is it possible to perform a facet field query on a subset of documents (the subset being defined via a filter query, for instance)? I understand that facet pivoting might work, but it would require that the subset be defined by some field hierarchy, e.g. manufacturer - price (then only look at the results for the manufacturer I'm interested in). What if I wanted to define a more complex subset (where the name starts with A but ends with Z, and some other field is greater than 5, and yet another field is not 'x', etc.)? Ideally I would then define a facet field constraining query to include only terms from documents that pass this query. Thanks, Luis
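Erick's description, spelled out as request parameters (collection and field names are from the example above; everything else is a placeholder) — facet counts are tabulated only over documents matching q plus every fq, and facet.mincount=1 drops zero-count terms from the listing:

```
http://localhost:8983/solr/collection1/select?q=*:*
    &fq=manufacturer:Sony
    &facet=true
    &facet.field=location
    &facet.mincount=1
```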
Facet field query on subset of documents
Hi All, Is it possible to perform a facet field query on a subset of documents (the subset being defined via a filter query for instance)? I understand that facet pivoting might work, but it would require that the subset be defined by some field hierarchy, e.g. manufacturer - price (then only look at the results for the manufacturer I'm interested in). What if I wanted to define a more complex subset (where the name starts with A but ends with Z and some other field is greater than 5 and yet another field is not 'x', etc.)? Ideally I would then define a facet field constraining query to include only terms from documents that pass this query. Thanks, Luis
Re: CachedSqlEntityProcessor not adding fields
I'm noticing some very odd behavior using dataimport from the Admin UI. Whenever I limit the number of rows to 75 or below, the aliases field never gets populated. As soon as I increase the limit to 76 or more, the aliases field gets populated! What am I not understanding here?

On Tue, Jul 30, 2013 at 11:04 AM, Luis Lebolo luis.leb...@gmail.com wrote: Hi All, I'm trying to use CachedSqlEntityProcessor in one of my sub-entities, but the field never gets populated. I'm using Solr 4.4. The field is a multi-valued field: The relevant part of my data-config.xml looks like:

<!-- This is the root entity -->
<entity name="model" query="select * from models">
  <field column="MODEL_ID" name="id"/>
  <field column="MODEL_NAME" name="name"/>
  <field column="GENDER" name="gender"/>
  <field column="DESCRIPTION" name="description"/>
  <entity name="entity" query="select e.ENTITY_TYPE_ID, et.ENTITY_TYPE from entities e inner join entity_types et on et.ENTITY_TYPE_ID = e.ENTITY_TYPE_ID where e.ENTITY_ID = '${model.MODEL_ID}'">
    <field column="ENTITY_TYPE_ID" name="entityTypeId"/>
    <field column="ENTITY_TYPE" name="entityType"/>
  </entity>
  <entity name="alias" query="select MODEL_ID as ALIAS_MODEL_ID, ALIAS_NAME from model_aliases" processor="CachedSqlEntityProcessor" cacheKey="ALIAS_MODEL_ID" cacheLookup="model.MODEL_ID">
    <field column="ALIAS_NAME" name="aliases"/>
  </entity>
  ...
</entity>

Let me know if you need more info. Any ideas appreciated! Thanks, Luis
Re: SOLR online reference document - WIKI
This page never came up on any of my Google searches, so thanks for the heads up! Looks good. -Luis On Tue, Jun 25, 2013 at 12:32 PM, Learner bbar...@gmail.com wrote: I just came across a wonderful online reference wiki for SOLR and thought of sharing it with the community.. https://cwiki.apache.org/confluence/display/solr/Apache+Solr+Reference+Guide -- View this message in context: http://lucene.472066.n3.nabble.com/SOLR-online-reference-document-WIKI-tp4073110.html Sent from the Solr - User mailing list archive at Nabble.com.
SolrDocument getFieldNames() exclude dynamic fields?
Hi All, I'm using SolrJ's QueryResponse to retrieve all SolrDocuments from a query. When I use SolrDocument's getFieldNames(), I get back a list of fields that excludes dynamic fields (even though I know they are not empty). Is there a way to get a list of all fields for a given SolrDocument? Thanks, Luis
Re: SolrDocument getFieldNames() exclude dynamic fields?
Apologies, I wasn't storing these dynamic fields. On Fri, Apr 26, 2013 at 11:01 AM, Luis Lebolo luis.leb...@gmail.com wrote: Hi All, I'm using SolrJ's QueryResponse to retrieve all SolrDocuments from a query. When I use SolrDocument's getFieldNames(), I get back a list of fields that excludes dynamic fields (even though I know they are not empty). Is there a way to get a list of all fields for a given SolrDocument? Thanks, Luis
SolrJ Custom RowMapper
Hi All, Does SolrJ have an option for a custom RowMapper or BeanPropertyRowMapper (I'm using Spring/JDBC terms). I know the QueryResponse has a getBeans method, but I would like to create my own mapping and plug it in. Any pointers? Thanks, Luis
Re: SolrException parsing error
Turns out I spoke too soon. I was *not* sending the query via POST. Changing the method to POST solved the issue for me (maybe I was hitting a GET limit somewhere?). -Luis On Tue, Apr 16, 2013 at 7:38 AM, Marc des Garets m...@ttux.net wrote: Did you find anything? I have the same problem but it's on update requests only. The error comes from the solrj client indeed. It is solrj logging this error. There is nothing in solr itself and it does the update correctly. It's fairly small simple documents being updated. On 04/15/2013 07:49 PM, Shawn Heisey wrote: On 4/15/2013 9:47 AM, Luis Lebolo wrote: Hi All, I'm using Solr 4.1 and am receiving an org.apache.solr.common.** SolrException parsing error with root cause java.io.EOFException (see below for stack trace). The query I'm performing is long/complex and I wonder if its size is causing the issue? I am querying via POST through SolrJ. The query (fq) itself is ~20,000 characters long in the form of: fq=(mutation_prot_mt_1_1:2374 + OR + mutation_prot_mt_2_1:2374 + OR + mutation_prot_mt_3_1:2374 + ...) + OR + (mutation_prot_mt_1_2:2374 + OR + mutation_prot_mt_2_2:2374 + OR + mutation_prot_mt_3_2:2374+...) + OR + ... In short, I am querying for an ID throughout multiple dynamically created fields (mutation_prot_mt_#_#). Any thoughts on how to further debug? 
Thanks in advance, Luis

--
SEVERE: Servlet.service() for servlet [X] in context with path [/x] threw exception [Request processing failed; nested exception is org.apache.solr.common.SolrException: parsing error] with root cause java.io.EOFException at org.apache.solr.common.util.FastInputStream.readByte(FastInputStream.java:193) at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:107) at org.apache.solr.client.solrj.impl.BinaryResponseParser.processResponse(BinaryResponseParser.java:41) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:387) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:181) at org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:90) at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301)

I am guessing that this log is coming from your SolrJ client, but that is not completely clear, so is it SolrJ or Solr that is logging this error? If it's SolrJ, do you see anything in the Solr log, and vice versa? This looks to me like a network problem, where something is dropping the connection before transfer is complete. It could be an unusual server-side config, OS problems, timeout settings in the SolrJ code, NIC drivers/firmware, bad cables, bad network hardware, etc. Thanks, Shawn
SolrException parsing error
Hi All, I'm using Solr 4.1 and am receiving an org.apache.solr.common.SolrException parsing error with root cause java.io.EOFException (see below for stack trace). The query I'm performing is long/complex and I wonder if its size is causing the issue? I am querying via POST through SolrJ. The query (fq) itself is ~20,000 characters long in the form of: fq=(mutation_prot_mt_1_1:2374 + OR + mutation_prot_mt_2_1:2374 + OR + mutation_prot_mt_3_1:2374 + ...) + OR + (mutation_prot_mt_1_2:2374 + OR + mutation_prot_mt_2_2:2374 + OR + mutation_prot_mt_3_2:2374+...) + OR + ... In short, I am querying for an ID throughout multiple dynamically created fields (mutation_prot_mt_#_#). Any thoughts on how to further debug? Thanks in advance, Luis -- SEVERE: Servlet.service() for servlet [X] in context with path [/x] threw exception [Request processing failed; nested exception is org.apache.solr.common.SolrException: parsing error] with root cause java.io.EOFException at org.apache.solr.common.util.FastInputStream.readByte(FastInputStream.java:193) at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:107) at org.apache.solr.client.solrj.impl.BinaryResponseParser.processResponse(BinaryResponseParser.java:41) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:387) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:181) at org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:90) at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301) at x.x.x.x.x.x.someMethod(x.java:111) at x.x.x.x.x.x.otherMethod(x.java:222) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213) at 
org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882) at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:778) at javax.servlet.http.HttpServlet.service(HttpServlet.java:621) at javax.servlet.http.HttpServlet.service(HttpServlet.java:722) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) at x.x.x.x.x.yetAnotherMethod(x.java:333) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118) at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at 
org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at org.springframework.security.web.authentication.rememberme.RememberMeAuthenticationFilter.doFilter(RememberMeAuthenticationFilter.java:146) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at
Re: Query Parser OR AND and NOT
What if you try city:(*:* -H*) OR zip:30* Sometimes Solr requires a list of documents to subtract from (think of *:* -someQuery converts to all documents without someQuery). You can also try looking at your query with debugQuery = true. -Luis On Mon, Apr 15, 2013 at 12:25 PM, Peter Schütt newsgro...@pstt.de wrote: Hallo, Roman Chyla roman.ch...@gmail.com wrote in news:caen8dywjrl+e3b0hpc9ntlmjtrkasrqlvkzhkqxopmlhhfn...@mail.gmail.com: should be: -city:H* OR zip:30* -city:H* OR zip:30* numFound:2520 gives the same wrong result. Another Idea? Ciao Peter Schütt
Re: SolrException parsing error [Solved]
Sorry, spoke too soon. Turns out I was not sending the query via POST. Changing the method to POST solved the issue. Apologies for the spam! -Luis

On Mon, Apr 15, 2013 at 11:47 AM, Luis Lebolo luis.leb...@gmail.com wrote: Hi All, I'm using Solr 4.1 and am receiving an org.apache.solr.common.SolrException parsing error with root cause java.io.EOFException (see below for stack trace). The query I'm performing is long/complex and I wonder if its size is causing the issue? I am querying via POST through SolrJ. The query (fq) itself is ~20,000 characters long in the form of: fq=(mutation_prot_mt_1_1:2374 + OR + mutation_prot_mt_2_1:2374 + OR + mutation_prot_mt_3_1:2374 + ...) + OR + (mutation_prot_mt_1_2:2374 + OR + mutation_prot_mt_2_2:2374 + OR + mutation_prot_mt_3_2:2374+...) + OR + ... In short, I am querying for an ID throughout multiple dynamically created fields (mutation_prot_mt_#_#). Any thoughts on how to further debug? Thanks in advance, Luis -- SEVERE: Servlet.service() for servlet [X] in context with path [/x] threw exception [Request processing failed; nested exception is org.apache.solr.common.SolrException: parsing error] with root cause java.io.EOFException at org.apache.solr.common.util.FastInputStream.readByte(FastInputStream.java:193) at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:107) at org.apache.solr.client.solrj.impl.BinaryResponseParser.processResponse(BinaryResponseParser.java:41) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:387) at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:181) at org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:90) at org.apache.solr.client.solrj.SolrServer.query(SolrServer.java:301) at x.x.x.x.x.x.someMethod(x.java:111) at x.x.x.x.x.x.otherMethod(x.java:222) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at
sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882) at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:778) at javax.servlet.http.HttpServlet.service(HttpServlet.java:621) at javax.servlet.http.HttpServlet.service(HttpServlet.java:722) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) at x.x.x.x.x.yetAnotherMethod(x.java:333) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:118) at 
org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:84) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:113) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:113) at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:342) at org.springframework.security.web.authentication.rememberme.RememberMeAuthenticationFilter.doFilter(RememberMeAuthenticationFilter.java:146) at org.springframework.security.web.FilterChainProxy
Query on all dynamic fields or wildcard field query
Hi All, First I have to apologize and admit that I'm asking this question before doing any real research =( Was hoping for some preliminary help before I start this endeavor tomorrow. So here goes: Can I query for a value in multiple (wildcarded) fields? For example, if I have dynamic fields fieldName_someToken (e.g. fieldName_1, fieldName_2, fieldName_3), can I construct a query like fieldName_*:someValue? The query itself doesn't work, but is there a way to query numerous dynamic fields without explicitly listing them? Thanks, Luis
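The field side of a standard query doesn't support wildcards, but one common workaround (not mentioned in this thread) is to copy all the dynamic fields into a single catch-all field at index time and query that instead; a schema.xml sketch with assumed names:

```xml
<!-- Hypothetical catch-all: everything indexed under fieldName_* is also
     copied into fieldName_all, which can then be queried directly, e.g.
     fieldName_all:someValue. -->
<dynamicField name="fieldName_*" type="string" indexed="true" stored="true"/>
<field name="fieldName_all" type="string" indexed="true" stored="false" multiValued="true"/>
<copyField source="fieldName_*" dest="fieldName_all"/>
```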
Querying for ~2000 integers - better model?
Hello! First time poster, so {insert ignorance disclaimer here ;)}.

I'm building a web application backed by an Oracle database, and we're using Lucene/Solr to index various lists of entities (via DIH). We then harness Solr's faceting to allow the user to filter through their searches.

One aspect we're having trouble modeling is the concept of data availability. A dataset will have a data value for various entity pairs. To generalize, say we have two entities: Apples and Oranges. There's a data value for various Apple and Orange pairs (e.g. apple1 and orange5 have value 6.566). The question we want to model is which Apples have data for a specific set of Oranges. The problem is that the list of Oranges can be ~2000.

Our first (and admittedly ugly) approach was to create a dataAvailability field in each Apple document. It's a multi-valued field that holds a list of Oranges (actually a list of Orange IDs) that have data for that specific Apple. Our facet query then becomes ...facet.query=dataAvailability:(1 OR 2 OR 4 OR 45 OR 200 OR ...)...

For 1000 Oranges, the query takes a long time to run the first time a user performs it (afterwards it gets cached, so it runs fairly quickly). Any thoughts on how to speed this up? Is there a better model to use? One idea was to use the autowarming features. However, the list of Oranges will always be dynamically built by the user (and it's not feasible to autowarm all possible permutations of ~2000 Oranges =)).

Hope the generalization isn't too stupid, and thanks in advance! Cheers, Luis
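For concreteness, the facet query described above might be assembled from the user's selected Orange IDs like this (pure string-building sketch; the class and method names are made up, only the field name comes from the post):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical helper that turns a list of Orange IDs into the
// facet.query value used above: dataAvailability:(id1 OR id2 OR ...).
public class FacetQueryBuilder {
    static String buildFacetQuery(List<Integer> orangeIds) {
        return "dataAvailability:(" + orangeIds.stream()
                .map(String::valueOf)
                .collect(Collectors.joining(" OR ")) + ")";
    }

    public static void main(String[] args) {
        System.out.println(buildFacetQuery(Arrays.asList(1, 2, 4, 45, 200)));
        // dataAvailability:(1 OR 2 OR 4 OR 45 OR 200)
    }
}
```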
Re: Querying for ~2000 integers - better model?
Hi Mikhail, Thanks for the interest! The user selects various Oranges from the website. The list of Orange IDs then gets placed into a table in our database. For example, the user may want to search oranges from Florida (a state filter) planted a week ago (a data filter). We then display 600 Oranges that fit this query and the user says select them all. We then store all 600 IDs in our database. For the data availability filter, we get the list of Orange IDs from the database first then use SolrJ to create the facet query. -Luis On Tue, Feb 5, 2013 at 12:03 PM, Mikhail Khludnev mkhlud...@griddynamics.com wrote: Hello Luis, Your problem seems fairly obvious (hard to solve problem). Where these set of orange id come from? Does an user enter thousand of these ids into web-form? On Tue, Feb 5, 2013 at 8:49 PM, Luis Lebolo luis.leb...@gmail.com wrote: Hello! First time poster so {insert ignorance disclaimer here ;)}. I'm building a web application backed by an Oracle database and we're using Lucene Solr to index various lists of entities (via DIH). We then harness Solr's faceting to allow the user to filter through their searches. One aspect we're having trouble modeling is the concept of data availability. A dataset will have a data value for various entity pairs. To generalize, say we have two entities: Apples and Oranges. Therefore, there's a data value for various Apple and Orange pairs (e.g. apple1 orange5 have value 6.566). The question we want to model is which Apples have data for a specific set of Oranges. The problem is that the list of Oranges can be ~2000. Our first (and albeit ugly) approach was to create a dataAvailability field in each Apple document. It's a multi-valued field that holds a list of Oranges (actually a list of Orange IDs) that have data for that specific Apple. Our facet query then becomes ...facet.query=dataAvailability:(1 OR 2 OR 4 OR 45 OR 200 OR ...)... 
For 1000 Oranges, the query takes a long time to run the first time a user performs it (afterwards it gets cached so it runs fairly quickly). Any thoughts on how to speed this up? Is there a better model to use? One idea was to use the autowarming features. However, the list of Oranges will always be dynamically built by the user (and it's not feasible to autowarm all possible permutations of ~2000 Oranges =)). Hope the generalization isn't too stupid, and thanks in advance! Cheers, Luis -- Sincerely yours Mikhail Khludnev Principal Engineer, Grid Dynamics http://www.griddynamics.com mkhlud...@griddynamics.com