I see the same error from my php/curl script when my request times out. I
believe you can increase your timeouts both in php/curl and in your Solr configs.
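A minimal sketch of raising the client-side timeout, assuming the PECL solr extension (the hostname, port, path, and timeout values shown are illustrative, not from the original thread):

```php
<?php
// Hedged sketch: increasing the client-side timeout for large commits.
// Assumes the PECL solr extension is installed; connection values are
// illustrative placeholders.
$options = array(
    'hostname' => 'localhost',
    'port'     => 8983,
    'path'     => 'solr',
    'timeout'  => 300, // seconds; the default is far lower
);
$solr_client = new SolrClient($options);
```

If you are talking to Solr through plain curl instead, CURLOPT_TIMEOUT and CURLOPT_CONNECTTIMEOUT (via curl_setopt) serve the same purpose on that side.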
On Wed, Jan 4, 2012 at 3:15 PM, Brian Lamb <brian.l...@journalexperts.com> wrote:
Hi Param,
That's the method I'm switching over from. That script works inefficiently
with my setup because the data is spread out over multiple tables. I've
considered creating a simple MySQL table just to maintain the Solr data,
but I wanted to try out this PHP extension first.
But thanks for the suggestion!
Brian Lamb
On Wed, Jan 4, 2012 at 2:58 PM, Sethi, Parampreet <parampreet.se...@teamaol.com> wrote:
Hi Brian,
Not exactly a solution to your problem, but it may help: you can run Solr
directly on top of your database if your schema is a simple manipulation of
the database fields. That way you only need to update the database, and the
Solr index will be updated automatically with the latest data. I am using
this in production and it's working pretty neatly.
Here are a few helpful links:
http://wiki.apache.org/solr/DataImportHandler
http://www.params.me/2011/03/configure-apache-solr-14-with-mysql.html
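For the DataImportHandler approach described above, a minimal data-config.xml sketch, assuming a MySQL table named `articles` with matching fields in the Solr schema (the table name, columns, JDBC URL, and credentials are illustrative placeholders):

```xml
<dataConfig>
  <!-- Illustrative MySQL connection; adjust URL and credentials. -->
  <dataSource driver="com.mysql.jdbc.Driver"
              url="jdbc:mysql://localhost/mydb"
              user="solr" password="secret"/>
  <document>
    <!-- One Solr document per row returned by the query. -->
    <entity name="article" query="SELECT id, title, body FROM articles">
      <field column="id"    name="id"/>
      <field column="title" name="title"/>
      <field column="body"  name="body"/>
    </entity>
  </document>
</dataConfig>
```

The handler is registered in solrconfig.xml as a request handler and then triggered with a command such as full-import; the wiki link above covers the details.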
-param
On 1/4/12 2:50 PM, Brian Lamb brian.l...@journalexperts.com wrote:
Hi all,
I've been exploring http://www.php.net/manual/en/book.solr.php as a way to
maintain my index. I already have a PHP script that I use to update a
database, so I was hoping to be able to update the database and the index
at the same time.
However, I've been getting the following error when trying to run
$solr_client->commit():
Unsuccessful update request. Response Code 0. (null)
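One way to get more detail than "Response Code 0" is to catch the exception the extension throws on a failed request. A sketch, assuming the PECL solr extension (the connection values are illustrative placeholders):

```php
<?php
// Hedged sketch: surfacing the underlying failure when commit() fails.
// Assumes the PECL solr extension; connection values are illustrative.
$solr_client = new SolrClient(array(
    'hostname' => 'localhost',
    'port'     => 8983,
    'path'     => 'solr',
));

try {
    $solr_client->commit();
} catch (SolrClientException $e) {
    // Response Code 0 typically means the HTTP request never completed
    // (timeout, refused connection), so the exception detail matters.
    echo 'Commit failed: ', $e->getMessage(),
         ' (code ', $e->getCode(), ")\n";
}
```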
I've tried to find out why I'm getting the error, but I cannot find a
reasonable explanation. My guess is that my index is rather large (22
million records) and the request is timing out, but I cannot confirm that
that is the case, nor do I know how to fix it even if it were.
Any help here would be greatly appreciated.
Thanks,
Brian Lamb