Radu Manole wrote:
Hi all,
I have a question about optimizing PHP for large applications.
Many applications group their functions in files (e.g. functions.inc.php) or
build classes, and these files/classes are loaded with 'require' or
'include' at the top of each main file.
What would be the speed penalty if
On Wed, Nov 05, 2003 at 05:17:29PM -0800, Chris W. Parker wrote:
:
: One thing you can do to make loops faster is the following: (yes I've
: posted this in the past!!)
:
: Unoptimized:
:
: $upperlimit = 100;
:
: for ($i = 0; $i < $upperlimit; $i++)
: {
: // your stuff
: }
:
: Optimized:
:
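The archive truncates the optimized version here. Based on the count-down loop discussed later in the thread, it presumably looked something like the sketch below (my reconstruction, not Chris's exact code; the array and sums are only there to make the two loops checkable):

```php
<?php
// Reconstruction only: the archived message is cut off, so this is a guess
// at the usual shape of this micro-optimization, not the original code.

$items = range(1, 100);

// Unoptimized: count() is re-called on every pass through the loop.
$sum1 = 0;
for ($i = 0; $i < count($items); $i++) {
    $sum1 += $items[$i];
}

// Common optimized form: cache the bound once and count down toward zero,
// so the loop condition is a cheap comparison against 0.
$sum2 = 0;
for ($i = count($items) - 1; $i >= 0; $i--) {
    $sum2 += $items[$i];
}

var_dump($sum1 === $sum2); // bool(true): both loops visit every element
```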
Cc: 'Php-General (E-mail)'
Subject: RE: [PHP] High bandwidth application tips
That depends on the moment this mailing is sent, and how. If you do it at
night, when there are fewer visitors (assuming you run a site mostly meant for
people in 'your own country'), there's no real
From: Luis Lebron [EMAIL PROTECTED]
Excellent tips. I think I'm really going to have to polish my SQL skills for
this task. Any good tools for benchmarking SQL queries?
If you've been following the Load Stress Tool thread, this program:
http://jakarta.apache.org/jmeter/index.html was mentioned.
-Original Message-
From: olinux [mailto:[EMAIL PROTECTED]
Sent: Thursday 06 November 2003 07:28
To: [EMAIL PROTECTED]
Subject: RE: [PHP] High bandwidth application tips
[snip]
I fourth the thing about database access. As long as you realize that
reading from disk isn't the fastest thing around
--- Luis Lebron [EMAIL PROTECTED] wrote:
Any good tools for benchmarking SQL queries?
This may not directly answer your question, but I find the mytop utility
very helpful for seeing what is happening with a MySQL server. It's
available here:
http://jeremy.zawodny.com/mysql/mytop/
Another
--- Luis Lebron [EMAIL PROTECTED] wrote:
I guess there is a configuration option in my.cnf for logging slow
queries?
There, or you can pass it in as a command line argument when you start the
server. Here is a good URL for more information:
http://www.mysql.com/doc/en/Slow_query_log.html
You
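For reference, the my.cnf side of this looks roughly like the fragment below (MySQL 4.x-era option names; the path and threshold are illustrative):

```ini
# my.cnf fragment (MySQL 4.x era option names; see the URL above)
[mysqld]
log-slow-queries = /var/log/mysql/slow.log   # where to write the slow log
long_query_time  = 2                         # log queries taking > 2 seconds
```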
Luis, et al --
...and then Luis Lebron said...
%
% One of the things I have suggested to the customer is offloading some of the
% work to a different server. For example, he wants to email a weekly message
...
% Does this sound like a good idea?
If you can afford more servers, it's almost
Eugene, et al --
...and then Eugene Lee said...
%
...
%
% Still, doing a greater-than or less-than comparison is a bit slow.
% You could try this:
%
% $up = 100;
% while ($up-- != 0)
% {
Hey, that's pretty slick. And does that reduce to
while ($up--)
as can be
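For what it's worth, yes: 0 is falsy in PHP, so the two forms iterate the same number of times for a non-negative start. A quick sketch to check (the function names are mine):

```php
<?php
// Demonstration that the explicit and implicit forms of the count-down
// while loop iterate the same number of times for a non-negative start.

function iterations_explicit(int $n): int {
    $count = 0;
    while ($n-- != 0) {  // compare against 0 explicitly
        $count++;
    }
    return $count;
}

function iterations_truthy(int $n): int {
    $count = 0;
    while ($n--) {       // relies on 0 being falsy in PHP
        $count++;
    }
    return $count;
}

var_dump(iterations_explicit(100)); // int(100)
var_dump(iterations_truthy(100));   // int(100)
```

Note that both forms loop forever if the counter starts negative, so they are only interchangeable for counts of zero or more.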
I have been coding PHP for a few years now, but I haven't built a high
bandwidth application. I am currently working on an application for a
customer that may have a very large number of users (10,000 or more
according to the customer). Are there any good reference books, articles
and general
--- Luis Lebron [EMAIL PROTECTED] wrote:
I am currently working on an application for a customer that may have
a very large number of users (10,000 or more according to the customer).
I currently design, develop, and maintain a suite of Web applications and
utilities that receive ten million
Chris Shiflett wrote:
I currently design, develop, and maintain a suite of Web applications and
utilities that receive ten million hits a day, and my experience has shown
that the number one thing you can do to make the biggest difference is to
limit the number of times you need to hit the
[snip]
limit the number of times you need to hit the database.
I second Chris on this.
[/snip]
I third that. The problem can become especially apparent in large
databases containing millions of records. Other than that just code
cleanly and document, document, document.
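One common way to cut down the number of database hits is to cache query results between requests. A minimal file-based sketch (cache_store()/cache_fetch() and the 60-second TTL are illustrative names and values, not from the thread):

```php
<?php
// Minimal sketch of query-result caching to avoid repeated database hits.
// Function names and the 60-second TTL are illustrative choices.

function cache_store(string $key, $value): void {
    file_put_contents(sys_get_temp_dir() . "/cache_$key", serialize($value));
}

function cache_fetch(string $key, int $ttl = 60) {
    $file = sys_get_temp_dir() . "/cache_$key";
    if (is_file($file) && time() - filemtime($file) < $ttl) {
        return unserialize(file_get_contents($file));
    }
    return null; // cache miss: caller runs the real query and stores the result
}

// Usage: only hit the database when the cache is cold.
$rows = cache_fetch('top_articles');
if ($rows === null) {
    $rows = ['pretend', 'query', 'result']; // stand-in for a real SQL query
    cache_store('top_articles', $rows);
}
var_dump(count($rows)); // int(3)
```

In production you would reach for a shared in-memory cache rather than temp files, but the shape of the check-miss-store pattern is the same.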
--
PHP General Mailing List
Sent: Wednesday 05 November 2003 21:12
To: 'Jay Blanchard'; Marco Tabini; [EMAIL PROTECTED]
Cc: Luis Lebron; Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips
One of the things I have suggested to the customer is offloading some of the
work to a different server
Wouter van Vliet mailto:[EMAIL PROTECTED]
on Wednesday, November 05, 2003 5:06 PM said:
* Unset when not used anymore
I don't do this enough.
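For anyone unsure what unset() buys you, a small sketch using memory_get_usage() (the array size is arbitrary):

```php
<?php
// Sketch: unset() drops the last reference to a large array so its memory
// can be reclaimed before the rest of a long-running script continues.

$before = memory_get_usage();

$big = range(1, 100000);   // allocate a large array
$peak = memory_get_usage();

unset($big);               // release it as soon as it is no longer needed
$after = memory_get_usage();

var_dump($peak > $before); // bool(true): the allocation was visible
var_dump($after < $peak);  // bool(true): unset() gave the memory back
```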
-Original Message-
From: Chris W. Parker [mailto:[EMAIL PROTECTED]
Sent: Thursday 06 November 2003 02:17
To: Php-General (E-mail)
Subject: RE: [PHP] High bandwidth application tips
Wouter van Vliet mailto:[EMAIL PROTECTED]
on Wednesday, November 05, 2003 5:06 PM said
Whoa, is that way of a loop really faster? I mean .. It looks as if the
exact same thing happens:
- $upperlimit is set
- $i counter is set
- for every loop $i is set one higher
- Also for every loop the expression ($i < $upperlimit) is evaluated.
Not sure, but I just did
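A harness like the one Wouter presumably ran can be sketched with microtime() (this is my own sketch, not his actual test; the timings are machine-dependent):

```php
<?php
// A minimal timing harness for comparing the counting-up and counting-down
// loop styles. Results vary by machine and PHP version.

function time_it(callable $fn): float {
    $start = microtime(true);
    $fn();
    return microtime(true) - $start;
}

$n = 1000000;

$up = time_it(function () use ($n) {
    for ($i = 0; $i < $n; $i++) {
    }
});

$down = time_it(function () use ($n) {
    for ($i = $n; $i > 0; $i--) {
    }
});

// Any difference is tiny; measure before micro-optimizing.
printf("up: %.4fs  down: %.4fs\n", $up, $down);
```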
--- Wouter van Vliet [EMAIL PROTECTED] wrote:
One time I had this script somebody else wrote. About 1000 lines, a
complete CMS in one file. It if'ed on simple $_GET vars about 10 times,
by opening a new if statement each and every time. After I changed this
to if ($_GET['p'] == 'one') { .. }
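The rewrite described above can also be expressed as a switch, which makes the one-branch-only behaviour explicit. A sketch with made-up page names:

```php
<?php
// Sketch of the rewrite Wouter describes: collapse a series of independent
// if blocks on $_GET['p'] into one dispatch so at most one branch runs.
// The page names here are made up for illustration.

function dispatch(string $p): string {
    switch ($p) {
        case 'one':
            return 'page one';
        case 'two':
            return 'page two';
        default:
            return 'front page';
    }
}

var_dump(dispatch('one'));     // string(8) "page one"
var_dump(dispatch('missing')); // string(10) "front page"
```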
[snip]
I fourth the thing about database access. As long as you realize that
reading from disk isn't the fastest thing around either. Make sure you
reduce the number of files to be read to as few as possible. And output
with something like readfile() to prevent the files being loaded
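A minimal demonstration of the readfile() point, capturing the streamed output with output buffering so it can be checked:

```php
<?php
// Sketch: readfile() streams a file straight to output instead of loading
// it into a PHP string first (as file_get_contents() + echo would).

$path = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($path, "hello from disk\n");

ob_start();
$bytes = readfile($path);  // returns the number of bytes sent to output
$out = ob_get_clean();

var_dump($bytes);                       // int(16)
var_dump($out === "hello from disk\n"); // bool(true)

unlink($path);
```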