My question is about using Solr for fast geospatial calculations against 
multiple locations. For example, we have a product that takes 2 to 10 
companies at a time (e.g. McDonald's 14,000 locations, Subway 20,000, 
Dunkin' Donuts 5,000) and identifies and maps any store overlap within a 
radius of 0.1 to 20 miles. As you are probably aware, with this many 
locations, performing these calculations on the fly just takes too long. 
Our initial solution was to process all distance calculations via a nightly 
batch job so the system only needs to retrieve them from the database. This 
has for the most part worked really well and returns results almost 
immediately, no matter how large the dataset.
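To give a sense of the scale, the nightly job is essentially a brute-force 
pairwise pass. A minimal sketch of that idea (the function and the 
3,958.8-mile Earth radius are illustrative, not our actual code):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # mean Earth radius ~3,958.8 mi

# Comparing every store in one chain against every store in another is
# O(n * m): e.g. 14,000 * 20,000 = 280,000,000 distance checks per pair
# of chains, which is why we precompute overnight.
```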

I know that Solr is very fast, especially for geospatial queries, but is 
there any way it could be faster doing millions of on-the-fly geospatial 
calculations than having the calculations already done and simply 
retrieving them from the database? If it could, I would not need to run a 
nightly process that pre-calculates the distances. Thoughts?
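For reference, the sort of per-point radius filter I understand Solr 
supports is the geofilt query parser; a sketch (the field name, center 
point, and radius here are placeholders, and d is in kilometers):

```
q=*:*&fq={!geofilt sfield=store_location pt=40.7128,-74.0060 d=8}
```

My concern is that our use case is many-to-many, not one point against an 
index, so it would mean issuing one such query per store.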

Regards,

Joe


Joseph Costello
Chief Information Officer

F&D Reports | Creditntell | ARMS
===========================
Information Clearinghouse Inc. & Market Service Inc.
310 East Shore Road, Great Neck, NY 11023
email: jose...@fdreports.com | Tel: 800.789.0123 ext 112 | Cell: 516.263.6555 | 
www.informationclearinghouseinc.com
