Yep, a custom query is seemingly the way forward on this. I use Containable a lot for other areas. Pagination is not applicable in this scenario because I'm not displaying the results to anybody :) The resultant user ids are saved in another table for a cron job to notify the users. Giving the custom query a try right now. Thanks!

Femi

Martin Westin wrote:

If you want to stay in Cake-land as much as possible, try making use of the joins key in your options for find(). Nate wrote a nice article showing how to get going: http://bakery.cakephp.org/articles/view/quick-tip-doing-ad-hoc-joins-in-model-find

General advice (apart from your big search function) is:

- Use Containable everywhere :)
- Paginate any of these big datasets. Well, any dataset with a reasonable expectation of ever returning more than 25 records, so that would be most models. When you are doing that tricky joining query, using the paginator might be tricky, but don't be lazy: make your own paginator for that action. You can probably skip the ordering magic from the paginator, but simple paging of the results can be done quickly the bad old way.
- Depending on the data, you might get good results caching it. I usually don't, but that is the fault of the type of data I most often deal with. If you can get away with the results sometimes being a few minutes "out of date" then caching could be for you.
- I have also done my own per-request caching when the same data is likely to be found multiple times (yes, special cases). I got a huge leap in speed compared to the built-in query caching by doing a special trick for my special situation.

I use a combination of these techniques to crawl through datasets that are 50 times bigger (and more) than your 20'000, and it's still going like a rocket. But I am no genius. I was in your shoes last year when we went from 50 to 400'000 unique records in 3 days... man, was that a fun summer? :) I learned a lot by transferring the live data to my dev setup and (very quickly) optimizing sloppy queries and limiting results the right way and so on. Keep asking specific questions if you have them. I'd be happy to help. If for nothing else... at least we can help show that Cake is not the slow-coach among PHP frameworks.
/Martin

On Jun 16, 12:42 pm, "Adam Royle" <[email protected]> wrote:
- Problem with Large Datasets Femi Taiwo
- Re: Problem with Large Datasets Adam Royle
- Re: Problem with Large Datasets Femi Taiwo
- Re: Problem with Large Datasets Adam Royle
- Re: Problem with Large Datasets Martin Westin
- Re: Problem with Large Datasets Femi Taiwo
- Re: Problem with Large Datasets James K
- Re: Problem with Large Datasets brian
- Re: Problem with Large Datasets Femi Taiwo
