It's possible to combine Cake's magic and raw SQL, which I do occasionally when 
I need something a bit out of the ordinary but still want to use the Cake 
methods (to make use of model binding, etc).

e.g.

function getHomepageArtists() {
    $query = array(
        'fields' => array('id', 'name', 'slug'),
        // Raw SQL subquery dropped into an ordinary Cake conditions string:
        // only artists that have both images and events are returned.
        'conditions' => 'Artist.id IN (SELECT artists_events.artist_id
            FROM artists_images
            INNER JOIN artists_events
                ON artists_events.artist_id = artists_images.artist_id)',
        'contain' => array('Image'),
        'order' => 'popularity, RAND()',
    );
    return $this->find('all', $query);
}


Hope that helps,
Adam

----- Original Message ----- 
  From: Femi Taiwo 
  To: [email protected] 
  Sent: Tuesday, June 16, 2009 8:32 PM
  Subject: Re: Problem with Large Datasets


  Thanks for replying.

  Actually, I've got everything properly indexed and have closely followed 
standards on that. I honestly just wanted to avoid writing a custom query (lazy 
me) and was hoping there was a way around it. I'll give that a try, then.

  Thanks!

  Femi


  Adam Royle wrote: 
    Hi Femi,

    You should really just use one query for that, with a number of joins, and 
let the database do all the work. My advice is to get someone familiar with SQL 
to construct the raw query, get it running at a decent speed inside the mysql 
console, and then find a way to get that SQL into Cake.

    I'm guessing you don't have any indexes on your tables either, so it's best 
to get that same person to look over them. 20,000 records is not a lot, and 
MySQL can handle it easily if you know what you are doing.

    Best of luck!

    Cheers,
    Adam

    ----- Original Message ----- 
      From: Femi Taiwo 
      To: Cake PHP 
      Sent: Tuesday, June 16, 2009 7:05 PM
      Subject: Problem with Large Datasets


      Hi all,

      I'm having a scalability problem and urgently need to refactor. The 
project is just two months old and already has over 20,000 users.
      How do you handle passing an array with 5,000+ values as a condition, 
      e.g. $conditions = array('User.id' => $bigArray); 
      As the number of records grows, it takes longer and longer.

      I'm not looking to paginate, because an alert is sent to the users in 
the final result set; browsing through the result set is NOT necessary.

      Models?
      I have a User model  with the following key relationships

          hasMany UserDegree
          hasMany Certification
          hasMany Skill
          hasMany Language
          belongsTo City

      The User model has a few columns of its own as well: dob, marital status 
and gender.

      Herein lies the problem: a user can set different criteria for searching 
through the entire user database using any of the attributes above.
      She can say, for example: I want users in Kansas, aged between 22 and 27, 
who have a degree in either Computer Science, Mathematics or Physics, have 
Marketing skills and can speak a bit of French.

      What I do right now is select from the users table those aged between 22 
and 27, then, among those, the ones with the relevant degrees, then, from that 
set, the ones with Marketing skills. So I keep filtering using their user_ids 
and passing them to the relevant model with conditions like 
$conditions = array('UserDegree.user_id' => $bigArray); 

      Initially, this worked well with a few thousand records, but now, with 
20,000, it takes 3-5 seconds and is really CPU-intensive and hard on the MySQL 
server.

      What can I do?

      Thanks!!

      Femi

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"CakePHP" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/cake-php?hl=en
-~----------~----~----~----~------~----~------~--~---