Hi there,
I have a table I'm indexing that has roughly 800,000 rows. From
reading around online and in this group, it feels like my index is
taking a long time to generate.
I have the following in my model:
define_index do
  indexes title, body
  has x_id
  has locked
  has created_at, y_id
  set_property :delta => true
end
In my sphinx.yml file, I have the following:
max_matches: 1000
html_strip: 1
sql_range_step: 10000000
min_word_len: 3
mem_limit: 256M
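(To be clear, Thinking Sphinx reads sphinx.yml per Rails environment,
so in the actual file those settings sit under an environment key; a
sketch assuming development, with the values copied from above:)

```yaml
development:
  max_matches: 1000
  html_strip: 1
  sql_range_step: 10000000
  min_word_len: 3
  mem_limit: 256M
```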
And here is sample output from running rake ts:index:
indexing index 'entry_core'...
collected 841492 docs, 1783.4 MB
collected 0 attr values
sorted 0.8 Mvalues, 100.0% done
sorted 205.5 Mhits, 100.0% done
total 841492 docs, 1783446653 bytes
total 1343.154 sec, 1327804.99 bytes/sec, 626.50 docs/sec
indexing index 'entry_delta'...
collected 0 docs, 0.0 MB
collected 0 attr values
sorted 0.0 Mvalues, nan% done
total 0 docs, 0 bytes
total 250.385 sec, 0.00 bytes/sec, 0.00 docs/sec
distributed index 'entry' can not be directly indexed; skipping.
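For context, the averages on that last "total" line follow directly
from the totals; a quick arithmetic check in plain Ruby, with the
numbers copied from the log above:

```ruby
# Sanity-check the averages reported at the end of the ts:index run.
total_bytes = 1_783_446_653   # from "total 841492 docs, 1783446653 bytes"
total_docs  = 841_492
seconds     = 1343.154        # from "total 1343.154 sec"

bytes_per_sec = total_bytes / seconds   # roughly 1,327,805, about 1.3 MB/s
docs_per_sec  = total_docs / seconds    # roughly 626.5 docs/sec

puts format("%.2f bytes/sec, %.2f docs/sec", bytes_per_sec, docs_per_sec)
```

So the sustained rate over the whole run is only about 1.3 MB/s.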
The full run takes around 25-30 minutes (without any delta indexes
populated yet). That seems like quite a while compared to the sample
times I've seen from other people. Does anybody have suggestions for
improving the performance, or comments on how this indexing speed
compares to what they have seen?
Thanks,
Simon
--
You received this message because you are subscribed to the Google Groups
"Thinking Sphinx" group.
To post to this group, send email to [email protected].
To unsubscribe from this group, send email to
[email protected].
For more options, visit this group at
http://groups.google.com/group/thinking-sphinx?hl=en.