The version of 1.2 I'm using does use the update servlet and would return
400 (or similar) if something went wrong, and 200 if OK, but, as you
suggested, perhaps a 200 doesn't necessarily mean everything worked.
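In the meantime, here's a rough sketch of how I could check the update
response body instead of trusting the HTTP status code alone (this assumes
the legacy `<result status="N">` response format; the exact body may differ
in other setups):

```python
import re

def update_succeeded(response_body: str) -> bool:
    """Check a legacy Solr update response body for success.

    The 1.2 update servlet can return HTTP 200 even when an add fails,
    so the status attribute in the body has to be inspected.
    Assumes the legacy <result status="N"> response format.
    """
    m = re.search(r'<result[^>]*\bstatus="(\d+)"', response_body)
    # status="0" means success; any other status (or no match) is a failure
    return m is not None and m.group(1) == "0"

# Illustrative bodies only:
print(update_succeeded('<result status="0"></result>'))        # True
print(update_succeeded('<result status="1">ERROR</result>'))   # False
```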

It sounds like 1.3 is the way to go. I will start with the 1.3 config and
schema and work from there to create our fields and see what happens. 
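As a starting point, I'm picturing field definitions roughly like this
(field names here are just placeholders; the types come from the stock
example schema, so I'll adapt them as needed):

```xml
<!-- hypothetical fields; adapt names and types to the 1.3 example schema -->
<field name="id" type="string" indexed="true" stored="true" required="true"/>
<field name="title" type="text" indexed="true" stored="true"/>
```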

Is there any problem using our existing 1.2 index in 1.3?




Yonik Seeley wrote:
> 
> On Mon, Jun 16, 2008 at 6:07 PM, dls1138 <[EMAIL PROTECTED]> wrote:
>> I'm getting all 200 return codes from Solr on all of my batches.
> 
> IIRC, Solr1.2 uses the update servlet and always returns 200 (you need
> to look at the response body to see if there was an error or not).
> 
>> I skimmed the logs for errors, but I didn't try to grep for "Exception".
>> I will take your advice and look there for some clues.
>>
>> Incidentally, I'm running Solr 1.2 using Jetty. I'm not on 1.3 because
>> I read it wasn't released yet. Is there a (more stable than 1.2) branch
>> of 1.3 I should be using instead?
> 
> If you aren't going to go into production for another month or so, I'd
> start using 1.3.
> Start off with a new solrconfig.xml from 1.3 and re-make any
> customizations to make sure you get the latest behavior.
> 
>> I know 1.2 is obviously dated, and came packaged with an old version of
>> Lucene. Should I update either or both?
> 
> Solr takes care of updating Lucene for you... I wouldn't recommend
> changing the version of Lucene independent of Solr unless you are
> pretty experienced in Lucene.
> 
> -Yonik
> 
>>
>>
>>
>>
>> Yonik Seeley wrote:
>>>
>>> No records should be dropped, regardless of if a commit or optimize is
>>> going on.
>>> Are you checking the return codes (HTTP return codes for Solr 1.3)?
>>> Some updates could be failing for some reason.
>>> Also grep for "Exception" in the solr log file.
>>>
>>> -Yonik
>>>
>>> On Mon, Jun 16, 2008 at 4:02 PM, dls1138 <[EMAIL PROTECTED]> wrote:
>>>>
>>>> I've been sending data in batches to Solr with no errors reported,
>>>> yet after a commit, over 50% of the records I added (before the
>>>> commit) do not show up, even after several subsequent commits down
>>>> the road.
>>>>
>>>> Is it possible that Solr/Lucene could be disregarding or dropping my
>>>> add queries if those queries were executed while a commit was running?
>>>>
>>>> For example, if I add 300 records and then do a commit, and during
>>>> the 10-20 seconds it takes the commit to execute (on an index of over
>>>> 1.2M records) I add 100 more records, are those adds lost? I'm
>>>> assuming they are not and will be visible after the next commit, but
>>>> I want to be sure, as it seems some are being dropped. I just need to
>>>> know if this can happen during commits or if I should be looking
>>>> elsewhere to resolve my dropped-record problem.
>>>>
>>>> Thanks.
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>>
> 
> 

-- 
View this message in context: 
http://www.nabble.com/Adding-records-during-a-commit-tp17872257p17874662.html
Sent from the Solr - User mailing list archive at Nabble.com.
