From what I understand, the larger max_matches is, the more Sphinx has to process, and thus the slower it will get. Whether it's noticeably slower, I've no idea.
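For anyone hitting the same 1000-result ceiling: the cap comes from Sphinx's max_matches setting, not from the :per_page option. A rough sketch of the two places involved follows — note the model name (Article) is illustrative, and option spelling varies between Thinking Sphinx versions, so treat this as a sketch rather than the definitive API:

```ruby
# config/sphinx.yml -- raise Sphinx's hard cap on returned results.
# 1000 is the Sphinx default, which is why :per_page => 100000 still
# only returned 1000 rows:
#
#   development:
#     max_matches: 100000
#
# After regenerating the Sphinx config and restarting searchd, a
# search can then page up to that cap (Article is a hypothetical model):
Article.search 'term', :per_page => 100_000, :page => 1
```

This is untested configuration, and per the discussion above, every extra match Sphinx is allowed to return costs memory and processing on both the daemon and the Ruby side.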
But echoing James' points - if it's not really search, then I'm not convinced Sphinx/TS is the best approach.

--
Pat

On 19/08/2009, at 2:39 AM, James Healy wrote:

> phil wrote:
>> I couldn't understand why, even though I was passing in a :per_page
>> => 100000 option, I was only getting back 1000. It seems there is
>> this max_matches setting. Is there any downside to bumping that up
>> to a ridiculously large amount? In every instance we page things
>> reasonably EXCEPT when we are producing sitemaps.
>
> I don't have any reliable info on downsides to making max_matches
> ridiculously large, other than pointing out it will instantiate plenty
> of objects and you may have memory constraints.
>
> Is Sphinx the best tool to use when building a sitemap, though? It
> really shines when used for user searches, but for iterating over
> large datasets AR may be a better fit (particularly if you're using
> Rails 2.2+ and have access to the each method).
>
> -- James Healy <jimmy-at-deefa-dot-com> Wed, 19 Aug 2009 11:34:25 +1000
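To illustrate James' suggestion: Rails' batched finders walk a large table in fixed-size chunks keyed on the primary key, so the whole dataset never sits in memory at once — which suits sitemap generation better than one enormous Sphinx result set. Below is a minimal plain-Ruby sketch of that pattern; Dataset is a stand-in for an ActiveRecord model, and the row/URL shapes are made up for illustration:

```ruby
# Stand-in for an ActiveRecord model; batch_after mimics fetching
# "the next batch_size rows with id greater than last_id".
class Dataset
  def initialize(rows)
    @rows = rows.sort_by { |r| r[:id] }
  end

  def batch_after(last_id, batch_size)
    @rows.select { |r| r[:id] > last_id }.first(batch_size)
  end
end

# Sketch of the batched-iteration idea behind Rails' find_each:
# repeatedly fetch a small batch, yield each row, and remember the
# last id seen so the next query starts after it.
def each_in_batches(dataset, batch_size = 2)
  last_id = 0
  loop do
    batch = dataset.batch_after(last_id, batch_size)
    break if batch.empty?
    batch.each { |row| yield row }
    last_id = batch.last[:id]
  end
end

# Hypothetical sitemap use: collect URLs two rows at a time.
rows = (1..5).map { |i| { :id => i, :slug => "page-#{i}" } }
urls = []
each_in_batches(Dataset.new(rows)) do |row|
  urls << "http://example.com/#{row[:slug]}"
end
# urls now holds all five URLs, fetched in batches of two
```

With a real model the same shape becomes a single find_each call; the point is that memory use is bounded by the batch size, not by max_matches.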
