Re: [PHP] High bandwidth application tips

2003-11-09 Thread dr. zoidberg
Radu Manole wrote:

Hi all,

I do have a question about optimizing the php for large applications.
Many applications group the functions in files (eg. functions.inc.php) or
build classes, and these files/classes are called with 'require' or
'include' on the top of each main file.
What would be the speed penalty if we have to include many files, or if
these function container files are over 100k in size? I know the PHP engine
will parse them too. Are there any tips to follow when using many functions?
Including only one big file is better only if you actually need all of the
functions in it. Otherwise, try to separate your functions into 3 or more
files: LoginFunctions.php, OtherFunctions.php, PictureFunctions.php, and
include only the ONE you need. If you need all of them, make only one
file, and include only that one file.
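
A minimal sketch of that idea (the $page dispatcher and the file names here
are just hypothetical examples):

<?php
/* Hypothetical dispatcher: include only the function group this
   request actually needs, instead of one big functions.inc.php. */
$page = isset($_GET['page']) ? $_GET['page'] : '';
switch ($page) {
    case 'login':
        require_once 'LoginFunctions.php';
        break;
    case 'pictures':
        require_once 'PictureFunctions.php';
        break;
    default:
        require_once 'OtherFunctions.php';
}
?>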

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] High bandwidth application tips

2003-11-06 Thread Eugene Lee
On Wed, Nov 05, 2003 at 05:17:29PM -0800, Chris W. Parker wrote:
: 
: One thing you can do to make loops faster is the following: (yes I've
: posted this in the past!!)
: 
: Unoptimized:
: 
: $upperlimit = 100;
: 
: for ($i = 0; $i < $upperlimit; $i++)
: {
:   // your stuff
: }
: 
: Optimized:
: 
: $upperlimit = 100;
: $i = -1;
: 
: while (++$i < $upperlimit)
: {
:   // your stuff
: }

Still, doing a greater-than or less-than comparison is a bit slow.
You could try this:

$up = 100;
while ($up-- != 0)
{
//  your stuff
}

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-06 Thread Luis Lebron
Excellent tips. I think I'm really going to have to polish my sql skills for
this task. Any good tools for benchmarking sql queries?

-Original Message-
From: Wouter van Vliet [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 05, 2003 7:06 PM
To: 'Luis Lebron'; 'Jay Blanchard'; 'Marco Tabini'; [EMAIL PROTECTED]
Cc: 'Php-General (E-mail)'
Subject: RE: [PHP] High bandwidth application tips


That depends on the moment this emailing is sent, and how. If you do it at
night, when there are fewer visitors (assuming you run a site mostly meant
for ppl in 'your own country'), there's no real need to get a second server
to do this.

I fourth the thing about database access. As long as you realize that
reading from disk isn't the fastest thing around either. Make sure you
reduce the number of files to be read to as little as possible. And output
with something like readfile() to prevent the files being loaded into
memory.

Some other things to think about (not only for the sake of performance, but
also readability):

* Use elseif or switch as much as possible rather than separate if statements
One time I had this script somebody else wrote. About 1000 lines, a complete
CMS in one file. It if'ed on simple $_GET vars about 10 times, opening a
new if statement each and every time. After I changed this to if ($_GET['p']
== 'one') { .. } elseif ($_GET['p'] == 'two') { .. } and so on, the script
became TWO SECONDS faster. switch ($_GET['p']) { case 'one': ... } was not
an option here, cuz there were more advanced expressions to be evaluated.
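
A small sketch of the difference (the show_*() handlers are made up):

/* Independent ifs: every condition is tested on every request. */
if ($_GET['p'] == 'one') { show_one(); }
if ($_GET['p'] == 'two') { show_two(); }
if ($_GET['p'] == 'three') { show_three(); }

/* Chained elseifs: testing stops at the first match. */
if ($_GET['p'] == 'one') { show_one(); }
elseif ($_GET['p'] == 'two') { show_two(); }
elseif ($_GET['p'] == 'three') { show_three(); }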

* Unset variables when they're not used anymore
If you load a large amount of data into a variable that exists until the end
of your script but is only used up to an earlier point, call unset($Var); to
free some memory
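
For instance (the file name is made up):

$Data = file('/path/big-export.csv');  /* large array held in memory */
$LineCount = count($Data);             /* last place $Data is needed  */
unset($Data);                          /* free the memory here ...    */
/* ... the rest of the script runs without the big array around. */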

* Only code what is neccisery (still can't spell that word :D)
Logically, do not write something in 5 lines that can be done in 1. And only
create temp vars if you really need them; usually that's when you use a
value more than once. By the same token, only call a function as often as
you need to. One time I found this in some code that reached me:

$ID = Array(1, 5, 3, 7, 3, 9);
$i = 0;
while ($i < count($ID)) {
    $query = "SELECT * FROM table WHERE id = '" . $ID[$i] . "'";
    $result = mysql_query($query);
    $value = mysql_fetch_array($result);

    /* .. Do stuff with $value .. */
    $i++;
}

I optimized it to:

$IDs = Array(1, 5, 3, 7, 3, 9);
$ResultSet = mysql_query('SELECT * FROM table WHERE id IN (' . join(',', $IDs) . ')');
while ($value = mysql_fetch_assoc($ResultSet)) {
    /* .. Do stuff with $value .. */
}

Don't think I have to explain why the second version is quite faster.

Let me know if some of my lines were of any value to you.. Or anybody else
;)

Wouter

-Original Message-
From: Luis Lebron [mailto:[EMAIL PROTECTED] 
Sent: Wednesday 05 November 2003 21:12
To: 'Jay Blanchard'; Marco Tabini; [EMAIL PROTECTED]
Cc: Luis Lebron; Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips

One of the things I have suggested to the customer is offloading some of the
work to a different server. For example, he wants to email a weekly message
to all the users with information on records that match their search
criteria. I suggested setting up a second server that handles this and other
admin functions while the main server takes care of the users.

Does this sound like a good idea?

thanks,

Luis

-Original Message-
From: Jay Blanchard [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 05, 2003 2:10 PM
To: Marco Tabini; [EMAIL PROTECTED]
Cc: Luis Lebron; Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips


[snip]
 limit the number of times you need to hit the database.
 
I second Chris on this.
[/snip]

I third that. The problem can become especially apparent in large databases
containing millions of records. Other than that just code cleanly and
document, document, document.



Re: [PHP] High bandwidth application tips

2003-11-06 Thread CPT John W. Holmes
From: Luis Lebron [EMAIL PROTECTED]

 Excellent tips. I think I'm really going to have to polish my sql skills
for
 this task. Any good tools for benchmarking sql queries?

If you've been following the Load Stress Tool thread, this program:
http://jakarta.apache.org/jmeter/index.html was mentioned. The web site
mentions that it can be used to also benchmark SQL queries through JDBC.
Seems very useful indeed... :)

---John Holmes...

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-06 Thread Wouter van Vliet
Yes, a filesystem hit is a filesystem hit .. And yes, be worried about the
number of GIF files you serve. Less is always better in this. 

What I meant in my advice about the readfile() function is that it makes a
big deal of a difference whether a file is sent directly to the client, or
first read into the memory of your PHP script. Just try it:
- Make a text file with enough lines that the difference is measurable by
a regular watch.
- Make two PHP scripts:
1. <?php readfile('/path/file.txt'); ?>
2. <?php $fd = fopen('/path/file.txt', 'r'); $Buffer = fread($fd,
filesize('/path/file.txt')); print $Buffer; ?>
- Notice the difference in speed and lines of code.

 Creating indexes on columns that appear in your WHERE clauses can 
 drastically increase performance when hitting the database. Be sure to 
 read your database documentation on creating indexes.

Yes, this is true. But please be aware of the fact that INSERTs on tables
with indexes are in fact slower, just as UPDATEs on indexed columns are.
Also, if you have a query like:

SELECT column1, column2, column3 FROM tablename WHERE column4 = 'foo'
AND column5 = 'bar';

having an index on just column4 or just column5 doesn't do any good. You
will need an index on BOTH columns; I think the syntax would be:

ALTER TABLE `tablename` ADD INDEX name (column4, column5);

Gosh, this is getting more and more interesting.. (seriously)

Wouter

-Original Message-
From: olinux [mailto:[EMAIL PROTECTED] 
Sent: Thursday 06 November 2003 07:28
To: [EMAIL PROTECTED]
Subject: RE: [PHP] High bandwidth application tips


[snip]

 I fourth the thing about database access. As long as you realize that 
 reading from disk isn't the fastest thing around either. Make sure you 
 reduce the number of files to be read to as little as possible. And 
 output with something like readfile() to prevent the files being 
 loaded into memory.

[/snip]

A filesystem hit is a filesystem hit whether you're requesting a php file or
an image for a button. If you are worried about filesystem hits then
shouldn't you also be worried about unnecessarily using GIFs etc. in your
page layouts? Likewise, cleaning up bloated HTML code and properly using CSS
can cut down page filesizes dramatically, saving bandwidth for the server
and clients. If users are potentially on dialup, cutting 20K off your pages
will make them a lot happier than shaving a couple tenths of a second off
the backend processes. (Not saying you should not be performance focused on
the backend as well.)

olinux



__
Do you Yahoo!?
Protect your identity with Yahoo! Mail AddressGuard
http://antispam.yahoo.com/whatsnewfree

--
PHP General Mailing List (http://www.php.net/) To unsubscribe, visit:
http://www.php.net/unsub.php

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] High bandwidth application tips

2003-11-06 Thread Radu Manole
Hi all,

I do have a question about optimizing the php for large applications.
Many applications group the functions in files (eg. functions.inc.php) or
build classes, and these files/classes are called with 'require' or
'include' on the top of each main file.

What would be the speed penalty if we have to include many files, or if
these function container files are over 100k in size? I know the PHP engine
will parse them too. Are there any tips to follow when using many functions?

Thanks,
Radu




 From: Luis Lebron [EMAIL PROTECTED]

  Excellent tips. I think I'm really going to have to polish my sql skills
 for
  this task. Any good tools for benchmarking sql queries?

 If you've been following the Load Stress Tool thread, this program:
 http://jakarta.apache.org/jmeter/index.html was mentioned. The web site
 mentions that it can be used to also benchmark SQL queries through JDBC.
 Seems very useful indeed... :)

 ---John Holmes...

 --
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-06 Thread Chris Shiflett
--- Luis Lebron [EMAIL PROTECTED] wrote:
 Any good tools for benchmarking sql queries?

This may not directly answer your question, but I find the mytop utility
very helpful for seeing what is happening with a MySQL server. It's
available here:

http://jeremy.zawodny.com/mysql/mytop/

Another thing you can do is configure MySQL to log slow queries, and then
make the threshold for what counts as slow stricter and stricter, so that
you can focus on the queries that would make the biggest difference. Sorry
if you're not using MySQL, since I'm assuming you are. :-)

Of course, I still think that avoiding database calls as much as possible
is a good thing.

Hope that helps.

Chris

=
My Blog
 http://shiflett.org/
HTTP Developer's Handbook
 http://httphandbook.org/
RAMP Training Courses
 http://www.nyphp.org/ramp

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-06 Thread Luis Lebron
I guess there is a configuration option in my.cnf for logging slow queries?

Luis

-Original Message-
From: Chris Shiflett [mailto:[EMAIL PROTECTED]
Sent: Thursday, November 06, 2003 9:11 AM
To: Luis Lebron; 'Wouter van Vliet'; 'Jay Blanchard'; 'Marco Tabini';
[EMAIL PROTECTED]
Cc: 'Php-General (E-mail)'
Subject: RE: [PHP] High bandwidth application tips


--- Luis Lebron [EMAIL PROTECTED] wrote:
 Any good tools for benchmarking sql queries?

This may not directly answer your question, but I find the mytop utility
very helpful for seeing what is happening with a MySQL server. It's
available here:

http://jeremy.zawodny.com/mysql/mytop/

Another thing you can do is configure MySQL to log slow queries, and then
make the threshold for what counts as slow stricter and stricter, so that
you can focus on the queries that would make the biggest difference. Sorry
if you're not using MySQL, since I'm assuming you are. :-)

Of course, I still think that avoiding database calls as much as possible
is a good thing.

Hope that helps.

Chris

=
My Blog
 http://shiflett.org/
HTTP Developer's Handbook
 http://httphandbook.org/
RAMP Training Courses
 http://www.nyphp.org/ramp


RE: [PHP] High bandwidth application tips

2003-11-06 Thread Chris Shiflett
--- Luis Lebron [EMAIL PROTECTED] wrote:
 I guess there is a configuration option in my.cnf for logging slow
 queries?

There, or you can pass it in as a command line argument when you start the
server. Here is a good URL for more information:

http://www.mysql.com/doc/en/Slow_query_log.html

You can keep decreasing the value of long_query_time as you improve your
queries.
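
For example, something along these lines in my.cnf (an illustrative
excerpt; check the manual page above for the exact option names in your
MySQL version):

[mysqld]
log-slow-queries = /var/log/mysql/slow.log
long_query_time  = 5    # seconds; lower this as your queries improve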

Hope that helps.

Chris

=
My Blog
 http://shiflett.org/
HTTP Developer's Handbook
 http://httphandbook.org/
RAMP Training Courses
 http://www.nyphp.org/ramp

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] High bandwidth application tips

2003-11-06 Thread David T-G
Luis, et al --

...and then Luis Lebron said...
% 
% One of the things I have suggested to the customer is offloading some of the
% work to a different server. For example, he wants to email a weekly message
...
% Does this sound like a good idea?

If you can afford more servers, it's almost always good :-)

In addition, if you're going to be sending high volumes of mail, it's
very probably a good idea.

I recently coded up some parts of a marketing/fan site for promoting
music groups, video games, and such (surf over to killswitcharmy.com
and/or didofans.com if you're interested).  One thing we do is send
template-driven personalized (== one message per person) bulk mail to
our members.  With some casual tuning, I managed to get this quad-Xeon
box with 1G RAM and hw-mirrored SCSI drives down to about 120ms per
message injection, or 30K/hr, but then they have to get off of the box.

We're currently seeing a bottleneck in the mail queue as messages try to
get to other machines, have to wait to retry, and so on -- and it blocks
normal mail like our admin messages and the verification emails for new
subscribers.  Since we're at the mercy of the world's connectivity, it
doesn't matter whether we queue up a batch in the middle of the night or
any other time; even though 80% of our emails will make it out within a
couple of hours at most, on a 55K mailing that leaves 11K still hanging
around having to be retried (often repeatedly), and it takes a while to
get through those to send out a fresh email.

We're currently playing with setting up two separate queues so that all
of the bulk mail (both initial and retries) is handled separately, and at
low priority, while the rest of the mail is handled by the normal queue
(which will also get some hi-perf tuning), but that still leaves bounces
to be handled in the normal incoming stream (fortunately we can process
things very fast, even if the rest of the world can't, so that's not so
terribly bad).  It would be easier to just have a bulk mail machine to
handle all of that traffic and then the normal machine could handle
subscriptions and admin messages with ease, but the client won't pay an
additional monthly fee for a separate mail box.


% 
% thanks,
% 
% Luis


HTH  HAND  keep us posted :-)

:-D
-- 
David T-G  * There is too much animal courage in 
(play) [EMAIL PROTECTED] * society and not sufficient moral courage.
(work) [EMAIL PROTECTED]  -- Mary Baker Eddy, Science and Health
http://justpickone.org/davidtg/  Shpx gur Pbzzhavpngvbaf Qrprapl Npg!





Re: [PHP] High bandwidth application tips

2003-11-06 Thread David T-G
Eugene, et al --

...and then Eugene Lee said...
% 
...
% 
% Still, doing a greater-than or less-than comparison is a bit slow.
% You could try this:
% 
% 	$up = 100;
%   while ($up-- != 0)
%   {

Hey, that's pretty slick.  And does that reduce to

  while ($up--)

as can be done in C?  [I always forget return codes and what is true
versus false :-]


TIA  HAND

:-D
-- 
David T-G  * There is too much animal courage in 
(play) [EMAIL PROTECTED] * society and not sufficient moral courage.
(work) [EMAIL PROTECTED]  -- Mary Baker Eddy, Science and Health
http://justpickone.org/davidtg/  Shpx gur Pbzzhavpngvbaf Qrprapl Npg!





Re: [PHP] High bandwidth application tips

2003-11-05 Thread Chris Shiflett
--- Luis Lebron [EMAIL PROTECTED] wrote:
 I am currently working on an application for a customer that may have
 a very large amount of users (10,000 or more according to the customer).

I currently design, develop, and maintain a suite of Web applications and
utilities that receive ten million hits a day, and my experience has shown
that the number one thing you can do to make the biggest difference is to
limit the number of times you need to hit the database.

PHP itself, even without an accelerator, is very fast. A single Web server
can handle most people's needs, so long as the application is designed
well otherwise.

As an example of limiting database access, consider whether your
application queries the database many times to receive the exact same
result set. Is there a way to cache that locally? Or, perhaps you are
generating statistics for some reason, where you need to record data for
every visitor. What if, instead, you stored such statistical data once for
every 100 visits? Assuming the rand() function is very good, this should
allow you to multiply your statistics by 100 and have fairly accurate
statistics (assuming large data sets, like saying you have 1.4 million
users from the US). Your accuracy diminishes by a factor of 100, so you
just have to determine what amount of accuracy you need.
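
A rough sketch of that 1-in-100 sampling idea (the table and column names
are invented for illustration):

/* Record roughly one visit in every hundred ... */
if (rand(1, 100) == 1) {
    mysql_query("INSERT INTO visit_sample (country, hit_time)
                 VALUES ('US', NOW())");
}
/* ... and scale up when reporting:
   SELECT COUNT(*) * 100 AS estimated_visits FROM visit_sample; */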

There are a lot of things you can do, but I have found that performance
tuning your PHP logic can be very helpful, but it's nothing compared to
limiting your database access.

Hope that helps.

Chris

=
My Blog
 http://shiflett.org/
HTTP Developer's Handbook
 http://httphandbook.org/
RAMP Training Courses
 http://www.nyphp.org/ramp

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] High bandwidth application tips

2003-11-05 Thread Marco Tabini
Chris Shiflett wrote:
I currently design, develop, and maintain a suite of Web applications and
utilities that receive ten million hits a day, and my experience has shown
that the number one thing you can do to make the biggest difference is to
limit the number of times you need to hit the database.
I second Chris on this. No matter how fast your servers are, there is an 
inherent latency in hitting the database, particularly if you do so 
continuously and unnecessarily. Caching is usually a good solution to 
this problem--there are commercial products that will help you (like the 
Zend Performance Suite), but if you design with caching in mind from the 
outset, you should be able to obtain excellent results even without them.
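
As a sketch of designing with caching in mind (the path, the lifetime, and
the build_page() helper are all hypothetical):

$CacheFile = '/tmp/frontpage.cache';
if (file_exists($CacheFile) && time() - filemtime($CacheFile) < 300) {
    readfile($CacheFile);   /* fresh enough: skip the database entirely */
} else {
    $Html = build_page();   /* hypothetical: the expensive DB-driven work */
    $fp = fopen($CacheFile, 'w');
    fwrite($fp, $Html);
    fclose($fp);
    print $Html;
}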

Cheers,

Marco

--

php|architect - The Magazine for PHP Professionals
Try us free at http://www.phparch.com!
Complete searchable PHP mailing list archives at 
http://www.phparch.com/mailinglists

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


RE: [PHP] High bandwidth application tips

2003-11-05 Thread Jay Blanchard
[snip]
 limit the number of times you need to hit the database.
 
I second Chris on this.
[/snip]

I third that. The problem can become especially apparent in large
databases containing millions of records. Other than that just code
cleanly and document, document, document.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-05 Thread Luis Lebron
One of the things I have suggested to the customer is offloading some of the
work to a different server. For example, he wants to email a weekly message
to all the users with information on records that match their search
criteria. I suggested setting up a second server that handles this and other
admin functions while the main server takes care of the users.

Does this sound like a good idea?

thanks,

Luis

-Original Message-
From: Jay Blanchard [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 05, 2003 2:10 PM
To: Marco Tabini; [EMAIL PROTECTED]
Cc: Luis Lebron; Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips


[snip]
 limit the number of times you need to hit the database.
 
I second Chris on this.
[/snip]

I third that. The problem can become especially apparent in large
databases containing millions of records. Other than that just code
cleanly and document, document, document.


RE: [PHP] High bandwidth application tips

2003-11-05 Thread Wouter van Vliet
That depends on the moment this emailing is sent, and how. If you do it at
night, when there are fewer visitors (assuming you run a site mostly meant
for ppl in 'your own country'), there's no real need to get a second server
to do this.

I fourth the thing about database access. As long as you realize that
reading from disk isn't the fastest thing around either. Make sure you
reduce the number of files to be read to as little as possible. And output
with something like readfile() to prevent the files being loaded into
memory.

Some other things to think about (not only for the sake of performance, but
also readability):

* Use elseif or switch as much as possible rather than separate if statements
One time I had this script somebody else wrote. About 1000 lines, a complete
CMS in one file. It if'ed on simple $_GET vars about 10 times, opening a
new if statement each and every time. After I changed this to if ($_GET['p']
== 'one') { .. } elseif ($_GET['p'] == 'two') { .. } and so on, the script
became TWO SECONDS faster. switch ($_GET['p']) { case 'one': ... } was not
an option here, cuz there were more advanced expressions to be evaluated.

* Unset variables when they're not used anymore
If you load a large amount of data into a variable that exists until the end
of your script but is only used up to an earlier point, call unset($Var); to
free some memory

* Only code what is neccisery (still can't spell that word :D)
Logically, do not write something in 5 lines that can be done in 1. And only
create temp vars if you really need them; usually that's when you use a
value more than once. By the same token, only call a function as often as
you need to. One time I found this in some code that reached me:

$ID = Array(1, 5, 3, 7, 3, 9);
$i = 0;
while ($i < count($ID)) {
    $query = "SELECT * FROM table WHERE id = '" . $ID[$i] . "'";
    $result = mysql_query($query);
    $value = mysql_fetch_array($result);

    /* .. Do stuff with $value .. */
    $i++;
}

I optimized it to:

$IDs = Array(1, 5, 3, 7, 3, 9);
$ResultSet = mysql_query('SELECT * FROM table WHERE id IN (' . join(',', $IDs) . ')');
while ($value = mysql_fetch_assoc($ResultSet)) {
    /* .. Do stuff with $value .. */
}

Don't think I have to explain why the second version is quite faster.

Let me know if some of my lines were of any value to you.. Or anybody else
;)

Wouter

-Original Message-
From: Luis Lebron [mailto:[EMAIL PROTECTED] 
Sent: Wednesday 05 November 2003 21:12
To: 'Jay Blanchard'; Marco Tabini; [EMAIL PROTECTED]
Cc: Luis Lebron; Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips

One of the things I have suggested to the customer is offloading some of the
work to a different server. For example, he wants to email a weekly message
to all the users with information on records that match their search
criteria. I suggested setting up a second server that handles this and other
admin functions while the main server takes care of the users.

Does this sound like a good idea?

thanks,

Luis

-Original Message-
From: Jay Blanchard [mailto:[EMAIL PROTECTED]
Sent: Wednesday, November 05, 2003 2:10 PM
To: Marco Tabini; [EMAIL PROTECTED]
Cc: Luis Lebron; Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips


[snip]
 limit the number of times you need to hit the database.
 
I second Chris on this.
[/snip]

I third that. The problem can become especially apparent in large databases
containing millions of records. Other than that just code cleanly and
document, document, document.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-05 Thread Chris W. Parker
Wouter van Vliet mailto:[EMAIL PROTECTED]
on Wednesday, November 05, 2003 5:06 PM said:

 * Unset when not used anymore

I don't do this enough.

One thing you can do to make loops faster is the following: (yes I've
posted this in the past!!)

Unoptimized:

$upperlimit = 100;

for ($i = 0; $i < $upperlimit; $i++)
{
// your stuff
}

Optimized:

$upperlimit = 100;
$i = -1;

while (++$i < $upperlimit)
{
// your stuff
}


Chris.
--
Don't like reformatting your Outlook replies? Now there's relief!
http://home.in.tum.de/~jain/software/outlook-quotefix/

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-05 Thread Wouter van Vliet
Wow, is that way of writing a loop really faster? I mean .. it looks as if
the exact same thing happens:
- $upperlimit is set
- the $i counter is set
- for every loop $i is set one higher
- also for every loop the expression ($i < $upperlimit) is evaluated.

Any idea on why it's faster?

-Original Message-
From: Chris W. Parker [mailto:[EMAIL PROTECTED] 
Sent: Thursday 06 November 2003 02:17
To: Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips

Wouter van Vliet mailto:[EMAIL PROTECTED]
on Wednesday, November 05, 2003 5:06 PM said:

 * Unset when not used anymore

I don't do this enough.
 Me neither, to be honest ;)

One thing you can do to make loops faster is the following: (yes I've posted
this in the past!!)

Unoptimized:

$upperlimit = 100;

for ($i = 0; $i < $upperlimit; $i++)
{
// your stuff
}

Optimized:

$upperlimit = 100;
$i = -1;

while (++$i < $upperlimit)
{
// your stuff
}


Chris.
--
Don't like reformatting your Outlook replies? Now there's relief!
http://home.in.tum.de/~jain/software/outlook-quotefix/

--
PHP General Mailing List (http://www.php.net/) To unsubscribe, visit:
http://www.php.net/unsub.php

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-05 Thread Mike Migurski
Wow, is that way of writing a loop really faster? I mean .. it looks as if
the exact same thing happens:
- $upperlimit is set
- the $i counter is set
- for every loop $i is set one higher
- also for every loop the expression ($i < $upperlimit) is evaluated.

Not sure, but I just did a quick test - over one million iterations, it
was 0.47 seconds faster. It's up to you to decide whether that's going to
affect you much. :)
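
For anyone who wants to reproduce that comparison, a quick timing sketch
(PHP 4 style, since microtime() returned a string back then):

function stopwatch() {
    /* microtime() returns "usec sec"; combine into a float of seconds */
    list($usec, $sec) = explode(' ', microtime());
    return (float)$sec + (float)$usec;
}

$n = 1000000;

$t0 = stopwatch();
for ($i = 0; $i < $n; $i++) { /* empty body */ }

$t1 = stopwatch();
$i = -1;
while (++$i < $n) { /* empty body */ }

$t2 = stopwatch();
printf("for: %.3fs  while: %.3fs\n", $t1 - $t0, $t2 - $t1);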

I'd stick with Chris' suggestion to minimize database access, since that's
what will really nail you for busy sites. Scraping together milliseconds
here and there is fine if you're really against the wall, but I imagine
that the benefit could be more than offset by weighing the value of your
time as a developer against the cost of a slightly faster server.

-
michal migurski- contact info and pgp key:
sf/cahttp://mike.teczno.com/contact.html

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] High bandwidth application tips

2003-11-05 Thread Chris Shiflett
--- Wouter van Vliet [EMAIL PROTECTED] wrote:
 One time I had this script somebody else wrote. About 1000 lines, a
 complete CMS in one file. It if'ed on simple $_GET vars about 10 times,
 by opening a new if statement each and every time. After I changed this
 to if ($_GET['p'] == 'one') { .. } elseif ($_GET['p'] == 'two') { .. };
 and so on the script became TWO SECONDS faster.

Another minor performance tip to add to this is to use $p in your
conditional statements rather than $_GET['p']. Of course, you should
filter this data also, but try something like this:

/* "valid data" depends on your app; a whitelist check is one example */
if (isset($_GET['p']) && preg_match('/^\w+$/', $_GET['p']))
{
 $p = $_GET['p'];
}

When you use $p in the rest of the script, you save PHP the small
discovery time required for $_GET['p']. If you have many conditional
statements, this minor improvement can be amplified slightly, and if you
have many users (a few million a day, for example), this improvement is
further amplified.

Hope that helps.

Chris

=
My Blog
 http://shiflett.org/
HTTP Developer's Handbook
 http://httphandbook.org/
RAMP Training Courses
 http://www.nyphp.org/ramp

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] High bandwidth application tips

2003-11-05 Thread John Herren
Jay Blanchard wrote:

[snip]

limit the number of times you need to hit the database.

I second Chris on this.
[/snip]
I third that. The problem can become especially apparent in large
databases containing millions of records. Other than that just code
cleanly and document, document, document.
Creating indexes on columns that appear in your WHERE clauses can 
drastically increase performance when hitting the database. Be sure to 
read your database documentation on creating indexes.

A simple example of this would be a "forgot your password" script that
emails a user based on the user's email address. By indexing the email
address field, your query will run much faster than without the index,
and if you have several thousand records in your table, you will surely
notice the difference.
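
In MySQL that might look like (the table and column names are just
examples):

ALTER TABLE users ADD INDEX idx_email (email);

after which a query such as SELECT * FROM users WHERE email = '...' can
use the index instead of scanning every row.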

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


RE: [PHP] High bandwidth application tips

2003-11-05 Thread olinux

[snip]

 I fourth the thing about database access. As long as you realize that
 reading from disk isn't the fastest thing around either. Make sure you
 reduce the number of files to be read to as little as possible. And
 output with something like readfile() to prevent the files being loaded
 into memory.

[/snip]

A filesystem hit is a filesystem hit whether you're requesting a php file
or an image for a button. If you are worried about filesystem hits then
shouldn't you also be worried about unnecessarily using GIFs etc. in your
page layouts? Likewise, cleaning up bloated HTML code and properly using
CSS can cut down page filesizes dramatically, saving bandwidth for the
server and clients. If users are potentially on dialup, cutting 20K off
your pages will make them a lot happier than shaving a couple tenths of a
second off the backend processes. (Not saying you should not be
performance focused on the backend as well.)

olinux



__
Do you Yahoo!?
Protect your identity with Yahoo! Mail AddressGuard
http://antispam.yahoo.com/whatsnewfree

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php