[
https://issues.apache.org/jira/browse/SOLR-2155?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13238921#comment-13238921
]
Bill Bell commented on SOLR-2155:
---------------------------------
David,
We are seeing unexpectedly slow performance with your new 1.0.4 release.
INFO: [providersearch] webapp=/solr path=/select
params={d=160.9344&facet=false&wt=json&rows=6&start=0&pt=42.6450,-73.7777&facet.field=5star_45&f.5star_45.facet.mincount=1&qt=providersearchspecdist&fq=specialties_ids:(45+)&qq=city_state_lower:"albany,+ny"&f.5star_45.facet.limit=-1}
hits=960 status=0 QTime=8222
Hitting that with a slightly different lat/lon comes back almost instantly.
I'm not sure why some queries take seconds instead of milliseconds. There is
also this log entry a few lines before the slow query:
Mar 22, 2012 11:26:29 AM solr2155.solr.search.function.GeoHashValueSource <init>
INFO: field 'store_geohash' in RAM: loaded min/avg/max per doc #:
(1,1.1089503,11) #2270017
Are we missing something? Shall we go back to 1.0.3?
Shall I increase the following? What does this actually do?
<cache name="fieldValueCache"
class="solr.FastLRUCache" size="10" initialSize="1"
autowarmCount="1"/>
> Geospatial search using geohash prefixes
> ----------------------------------------
>
> Key: SOLR-2155
> URL: https://issues.apache.org/jira/browse/SOLR-2155
> Project: Solr
> Issue Type: Improvement
> Reporter: David Smiley
> Attachments: GeoHashPrefixFilter.patch, GeoHashPrefixFilter.patch,
> GeoHashPrefixFilter.patch,
> SOLR-2155_GeoHashPrefixFilter_with_sorting_no_poly.patch, SOLR.2155.p3.patch,
> SOLR.2155.p3tests.patch, Solr2155-1.0.2-project.zip,
> Solr2155-1.0.3-project.zip, Solr2155-1.0.4-project.zip,
> Solr2155-for-1.0.2-3.x-port.patch
>
>
> There currently isn't a solution in Solr for doing geospatial filtering on
> documents that have a variable number of points. This scenario occurs when
> there is location extraction (e.g. via a "gazetteer") occurring on free text.
> None, one, or many geospatial locations might be extracted from any given
> document and users want to limit their search results to those occurring in a
> user-specified area.
> I've implemented this by furthering the GeoHash based work in Lucene/Solr
> with a geohash prefix based filter. A geohash refers to a lat-lon box on the
> earth. Each successive character added further subdivides the box into a 4x8
> (or 8x4 depending on the even/odd length of the geohash) grid. The first
> step in this scheme is figuring out which geohash grid squares cover the
> user's search query. I've added various extra methods to GeoHashUtils (and
> added tests) to assist in this purpose. The next step is an actual Lucene
> Filter, GeoHashPrefixFilter, that uses these geohash prefixes in
> TermsEnum.seek() to skip to relevant grid squares in the index. Once a
> matching geohash grid is found, the points therein are compared against the
> user's query to see if it matches. I created an abstraction GeoShape
> extended by subclasses named PointDistance... and CartesianBox.... to support
> different queried shapes so that the filter need not care about these details.
> This work was presented at LuceneRevolution in Boston on October 8th.
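The subdivision scheme described in the quoted issue can be sketched with a minimal standalone geohash encoder. This is only an illustration of the encoding, not the Lucene/Solr code in the patch; `geohash_encode` is a hypothetical helper name.

```python
# Minimal geohash encoder illustrating the prefix scheme: each successive
# character adds 5 bits (alternating longitude/latitude), subdividing the
# current lat/lon box into a 32-cell grid. Points sharing a prefix therefore
# fall in the same grid square, which is what lets a filter seek through the
# terms index by prefix.

BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # standard geohash alphabet

def geohash_encode(lat, lon, precision=12):
    """Encode a latitude/longitude pair as a geohash string."""
    lat_range = [-90.0, 90.0]
    lon_range = [-180.0, 180.0]
    bits = []
    even = True  # geohash bit interleaving starts with a longitude bit
    while len(bits) < precision * 5:
        rng = lon_range if even else lat_range
        val = lon if even else lat
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            bits.append(1)
            rng[0] = mid  # keep the upper half of the box
        else:
            bits.append(0)
            rng[1] = mid  # keep the lower half of the box
        even = not even
    # Pack each group of 5 bits into one base-32 character.
    chars = []
    for i in range(0, len(bits), 5):
        idx = 0
        for b in bits[i:i + 5]:
            idx = (idx << 1) | b
        chars.append(BASE32[idx])
    return "".join(chars)

print(geohash_encode(42.605, -5.603, 5))  # "ezs42", the classic example cell
```

A shorter hash is always a prefix of the longer hash for the same point, so truncating the hash widens the grid square; a prefix filter exploits this by seeking directly to the prefixes that cover the query shape.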
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators:
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]