Excellent tips. I think I'm really going to have to polish my SQL skills for
this task. Any good tools for benchmarking SQL queries?

-----Original Message-----
From: Wouter van Vliet [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 05, 2003 7:06 PM
To: 'Luis Lebron'; 'Jay Blanchard'; 'Marco Tabini'; [EMAIL PROTECTED]
Cc: 'Php-General (E-mail)'
Subject: RE: [PHP] High bandwidth application tips


That depends on when this mailing is sent, and how. If you do it at
night, when there are fewer visitors (assuming you run a site mostly meant for
people in 'your own country'), there's no real need to get a second server to
do this.

I fourth the point about database access. Just keep in mind that reading from
disk isn't the fastest thing around either. Make sure you reduce the number of
files to be read to as few as possible, and output them with something like
readfile() to prevent the files from being loaded into memory.
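A minimal sketch of that readfile() point (the temp file here just stands in
for whatever real download you'd be serving):

```php
<?php
// A small temp file stands in for the real download.
$path = tempnam(sys_get_temp_dir(), 'dl');
file_put_contents($path, str_repeat('x', 1024));

// In a web context you'd send headers first (these are no-ops under the CLI):
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));

// readfile() streams the file straight to output in chunks and returns the
// number of bytes sent -- the file is never held in a PHP string.
$bytes = readfile($path);

unlink($path);
```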

Some other things to think about (not only for the sake of performance, but
also readability):
        
* Use elseif or switch rather than a series of separate if statements
One time I had this script somebody else wrote. About 1000 lines, a complete
CMS in one file. It tested simple $_GET vars about 10 times, opening a
new if statement each and every time. After I changed this to if ($_GET['p']
== 'one') { .. } elseif ($_GET['p'] == 'two') { .. } and so on, the script
became TWO SECONDS faster. switch { case 'one': ... } was not an option
here, because there were more advanced expressions to be evaluated.
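A sketch of that chained-elseif dispatch (dispatch() and the page names are
made up for illustration; $_GET['p'] is the page selector, as above):

```php
<?php
// Each branch short-circuits the rest, unlike a series of separate ifs
// which are all evaluated on every request.
function dispatch($p)
{
    if ($p == 'one') {
        return 'page one';
    } elseif ($p == 'two') {
        return 'page two';
    } elseif ($p == 'edit' && strlen($p) > 3) {  // a "more advanced expression"
        return 'editor';                          // that switch can't express
    } else {
        return 'front page';
    }
}

echo dispatch(isset($_GET['p']) ? $_GET['p'] : '');
```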

* Unset variables when you're done with them
If you load a large amount of data into a variable that exists until the end
of your script but is only used up to some earlier point, call unset($Var); to
free some memory.
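For instance (range() here just stands in for whatever large data set you
actually load):

```php
<?php
$before = memory_get_usage();

$rows = range(1, 100000);     // stand-in for a large data set
$sum  = array_sum($rows);     // last place $rows is actually used

$during = memory_get_usage(); // usage with $rows still in memory

unset($rows);                 // free the memory here, not at script end

$after = memory_get_usage();  // back down near the starting level
```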

* Only code what is necessary (still can't spell that word :D)
Logically, do not write in 5 lines what can be done in 1. And only
create temp vars if you really need them; usually that's when you use a
value more than once. By the same token, only call a function as often as you
need to. One time I found this in some code that reached me:

$ID = Array(1, 5, 3, 7, 3, 9);
$i = 0;
while($i < count($ID)) {
        $query = "SELECT * FROM table WHERE id = '".$ID[$i]."'";
        $result = mysql_query($query);
        $value = mysql_fetch_array($result);

        /* .. Do stuff with $value .. */

        $i++;
}

I optimized it to:

$IDs = Array(1, 5, 3, 7, 3, 9);
$ResultSet = mysql_query('SELECT * FROM table WHERE id IN ('.join(',', $IDs).')');
while($value = mysql_fetch_assoc($ResultSet)) {
        /* .. Do stuff with $value .. */
}

Don't think I have to explain why the second version is quite a bit faster: it
makes one query and one round trip to the database instead of one per ID.

Let me know if some of my lines were of any value to you.. Or anybody else
;)

Wouter

-----Original Message-----
From: Luis Lebron [mailto:[EMAIL PROTECTED] 
Sent: Wednesday 05 November 2003 21:12
To: 'Jay Blanchard'; Marco Tabini; [EMAIL PROTECTED]
Cc: Luis Lebron; Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips

One of the things I have suggested to the customer is offloading some of the
work to a different server. For example, he wants to email a weekly message
to all the users with information on records that match their search
criteria. I suggested setting up a second server that handles this and other
admin function while the main server takes care of the users.

Does this sound like a good idea?

thanks,

Luis

-----Original Message-----
From: Jay Blanchard [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 05, 2003 2:10 PM
To: Marco Tabini; [EMAIL PROTECTED]
Cc: Luis Lebron; Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips


[snip]
> limit the number of times you need to hit the database.
> 
I second Chris on this.
[/snip]

I third that. The problem can become especially apparent in large databases
containing millions of records. Other than that just code cleanly and
document, document, document.
