Thanks everyone. I got it working.
--
Craig Hoffman
w: http://www.craighoffmanphotography.com
FB: www.facebook.com/CraigHoffmanPhotography
TW: https://twitter.com/craiglhoffman
On 10/30/2014 1:27 PM, Craig Hoffman wrote:
> Thanks! One more question. wget seems to be choking on my URL, in
> particular the # and & characters. What's the best escaping method?
>
> http://
> :8983/solr/#/articles/dataimport//dataimport?command=full-import&clean=true&optimize=true
Putting the URL in quotes should keep the shell from interpreting those characters.
View this message in context:
http://lucene.472066.n3.nabble.com/Automating-Solr-tp4166696p4166707.html
Sent from the Solr - User mailing list archive at Nabble.com.
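For anyone hitting the same wget/curl problem: an unquoted & tells the shell to background the command, and # can start a comment, so most of the query string never reaches the client at all. Quoting the whole URL avoids this. (Note also that the /#/ form is the admin UI's browser-side route; the fragment is never sent to the server, so requests should go to the plain handler path.) A minimal sketch, where the host "localhost" and the "articles" core are assumptions:

```shell
# Quote the URL so the shell passes '&' and '#' through literally.
# "localhost" and the "articles" core name are assumptions for illustration.
URL='http://localhost:8983/solr/articles/dataimport?command=full-import&clean=true&optimize=true'

echo "$URL"        # the full query string survives intact inside the quotes
# curl -s "$URL"   # or: wget -qO- "$URL"
```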
> DataImportHandler exposes a variable
> called last_index_time which is a timestamp value denoting the last time
> full-import 'or' delta-import was run. You can use this variable anywhere in
> the SQL you write in data-config.xml and it will be replaced by the value
> during processing.
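As a sketch of how last_index_time is typically used in data-config.xml. The table and column names here (articles, updated_at) are made up for illustration, not from the thread:

```xml
<!-- Hypothetical entity definition; the "articles" table and its
     "updated_at" column are assumptions. -->
<entity name="article" pk="id"
        query="SELECT id, title, body FROM articles"
        deltaQuery="SELECT id FROM articles
                    WHERE updated_at &gt; '${dataimporter.last_index_time}'"
        deltaImportQuery="SELECT id, title, body FROM articles
                          WHERE id = '${dataimporter.delta.id}'"/>
```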
Do you mean DataImportHandler? If so, you can create full and
incremental import queries and trigger them from cron as often as
you like, e.g. nightly at 1 a.m.
Regards,
Alex.
On 30 October 2014 14:17, Craig Hoffman wrote:
> The data gets into Solr via MySQL script.
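A cron setup along those lines might look like the following. The host, port, core name, and schedule are all assumptions for illustration:

```crontab
# Full import nightly at 1 a.m., delta import hourly on the half hour.
# Adjust host/port/core to your setup; these values are hypothetical.
0 1 * * *  curl -s 'http://localhost:8983/solr/articles/dataimport?command=full-import&clean=true&optimize=true'
30 * * * * curl -s 'http://localhost:8983/solr/articles/dataimport?command=delta-import'
```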
Then you have to run it again and again
On 30 Oct 2014 at 19:18, "Craig Hoffman" wrote:
> The data gets into Solr via MySQL script.
> --
> Craig Hoffman
> w: http://www.craighoffmanphotography.com
> FB: www.facebook.com/CraigHoffmanPhotography
> TW: https://twitter.com/craiglhoffman
The data gets into Solr via MySQL script.
--
Craig Hoffman
w: http://www.craighoffmanphotography.com
FB: www.facebook.com/CraigHoffmanPhotography
TW: https://twitter.com/craiglhoffman
Right, of course. The data changes every few days. According to this
article, you can run a CRON Job to create a new index.
http://www.finalconcept.com.au/article/view/apache-solr-hints-and-tips
On Thu, Oct 30, 2014 at 12:04 PM, Alexandre Rafalovitch wrote:
> You don't "reindex Solr". You reindex data into Solr.
You don't "reindex Solr". You reindex data into Solr. So, this depends
on where your data is coming from and how often it changes. If the data
does not change, there's no point re-indexing it. And how do you get
the data into Solr in the first place?
Regards,
Alex.
Personal: http://www.outerthoughts.co
Simple question:
What is the best way to automate re-indexing Solr? Set up a cron job / curl script?
Thanks,
Craig
--
Craig Hoffman
w: http://www.craighoffmanphotography.com
FB: www.facebook.com/CraigHoffmanPhotography
TW: https://twitter.com/craiglhoffman