Unfortunately you can't index a specific item with Sphinx - only a specific 
index. So, in this case, either user_core or user_delta.
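On the earlier question of finding which rows were skipped: one rough workaround (my assumption, not an official Thinking Sphinx API) is to diff the ids in the database against the ids Sphinx returns. A minimal Ruby sketch with stand-in id lists:

```ruby
# Sketch with stand-in data: the database knows ids 1..10, but Sphinx has
# only indexed 1..7. In a real app the two lists might come from something
# like User.pluck(:id) and User.search_for_ids (assumptions - adjust to
# your setup and Thinking Sphinx version).
db_ids      = (1..10).to_a
indexed_ids = (1..7).to_a

# Records present in the database but missing from the index:
missing_ids = db_ids - indexed_ids
puts missing_ids.inspect  # => [8, 9, 10]
```

Any ids found this way could then be picked up by re-running the delta indexing (e.g. User.index_delta, as in the earlier messages below), without touching updated_at.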

-- 
Pat

On 01/03/2012, at 2:53 PM, Krishnaprasad Varma wrote:

> Hi Pat,
> 
> Thank you. I will try this today.
> 
> One more thing: how do I reindex a particular item?
> 
> For example: u = User.last, where that user has not been indexed due to
> a background job failure or something similar.
> 
> If I need to reindex, I usually save the record again, but that changes
> the updated_at time, which I don't want.
> 
> Is there any way to index just that record?
> 
> 
> 
> 
> On Feb 24, 9:07 am, "Pat Allan" <[email protected]> wrote:
>> Try the following instead:
>> 
>>   User.define_indexes
>>   User.index_delta
>> 
>> --
>> Pat
>> 
>> On 22/02/2012, at 12:17 AM, Krishnaprasad Varma wrote:
>> 
>> 
>>> I tried this but it throws an error:
>> 
>>>>> User.index_delta
>>> undefined method `index_delta' for #<Class:0x007fa1b1617ad8>
>> 
>>> On Feb 18, 3:35 pm, "Pat Allan" <[email protected]> wrote:
>>>> There's no easy way to determine which rows have been skipped - but you 
>>>> can either re-run the delta indexing for that model:
>> 
>>>>   User.index_delta
>> 
>>>> Or, if you want to index everything for that model, you can do this like 
>>>> so:
>> 
>>>>   indexer --config config/production.sphinx.conf --rotate user_core 
>>>> user_delta
>> 
>>>> Cheers
>> 
>>>> --
>>>> Pat
>> 
>>>> On 18/02/2012, at 6:23 PM, Krishnaprasad Varma wrote:
>> 
>>>>> Hi
>> 
>>>>> I have a user table which is indexed with delayed jobs. Everything is
>>>>> working quite well.
>>>>> Sometimes, when the delayed jobs server goes down, the delta index
>>>>> creation fails; hence there is a mismatch between the counts, as
>>>>> shown below:
>> 
>>>>> User.count = 1002
>>>>> User.search.total_entries = 820
>> 
>>>>> The remaining 182 records have to be reprocessed to create the delta
>>>>> indexes.
>> 
>>>>> I tried rebuilding the indexes, but I have other models as well, and
>>>>> rebuilding can only be done for all of them together, which takes more
>>>>> than an hour, as I have millions of rows in one of the models.
>> 
>>>>> How can I tackle this situation?
>> 
>>>>> Is there any way to index a particular model, and to do so without
>>>>> stopping the searchd daemon?
>>>>> Is there any way to check whether all the items in the table have been
>>>>> indexed, and if not, which rows have been skipped?
>> 
>>>>> Thank you
>> 
>>>>> --
>>>>> You received this message because you are subscribed to the Google Groups 
>>>>> "Thinking Sphinx" group.
>>>>> To post to this group, send email to [email protected].
>>>>> To unsubscribe from this group, send email to 
>>>>> [email protected].
>>>>> For more options, visit this group at 
>>>>> http://groups.google.com/group/thinking-sphinx?hl=en.
>> 
> 

