0.9 seconds doesn't sound too bad to me! Although you haven't told us how big each record is, only that you have 2,500 of them, so we can't guess. However, I would investigate the PHP gzip functions along with ob_start() so that your output can be compressed before being sent to the client.
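As a minimal sketch of that idea (assuming zlib support is compiled into your PHP build), you can pass ob_gzhandler as the callback to ob_start(); it inspects the client's Accept-Encoding header and falls back to plain output when gzip isn't supported:

```php
<?php
// Compress the whole page before it goes to the client.
// ob_gzhandler only gzips when the browser advertises support,
// otherwise the buffer is sent uncompressed.
ob_start('ob_gzhandler');

echo '<select name="record">';
// ... emit the 2,500 <OPTION> tags here ...
echo '</select>';

ob_end_flush(); // flush the (possibly gzipped) buffer to the client
?>
```

Repetitive markup like a long list of OPTION tags compresses very well, so this mostly helps the 56K-modem case rather than the server-side query time.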

Is the entire record data, or does it consist of JavaScript fluff and other furniture that might be better driven by client-side DHTML processing?

Also, with 2,500 records I would use a more efficient search algorithm, or page your output into, say, 25 pages of 100 records each. If your query often returns the same data, cache the output as a file on the server instead, and update the file only when you apply updates to the database.
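A rough sketch of the file-cache approach (the cache path and the column names 'id' and 'name' are placeholders, not from your schema):

```php
<?php
// Serve the <OPTION> list from a cached file; rebuild it only when
// the cache is missing. Delete the file whenever the table changes.
$cacheFile = '/tmp/options_cache.html'; // assumed location

if (!file_exists($cacheFile)) {
    $result = mysql_query($sql);
    $html = '';
    while ($row = mysql_fetch_array($result)) {
        // Build the markup in one string instead of echoing per row.
        $html .= '<option value="' . $row['id'] . '">'
               . htmlspecialchars($row['name']) . "</option>\n";
    }
    $fp = fopen($cacheFile, 'w');
    fwrite($fp, $html);
    fclose($fp);
}

readfile($cacheFile); // every later request skips the query entirely
?>
```

For the paging variant, add LIMIT/OFFSET to the query itself (e.g. "... LIMIT 100 OFFSET 200" for page 3) so MySQL only returns the 100 rows you display.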

Cheers - Neil.

At 22:11 21/12/2003 +0000, you wrote:
From: "Robin Kopetzky" <[EMAIL PROTECTED]>
Date: Sun, 21 Dec 2003 15:15:35 -0700
MIME-Version: 1.0
Content-Type: text/plain;
Content-Transfer-Encoding: 7bit
Subject: PHP/DB speed

Good afternoon!

I am writing a project and have a speed concern...

The code I am using is thus and is retrieving around 2,500 records:

        $result = mysql_query($sql);
        while ($row = mysql_fetch_array($result)) {
                // build <OPTION> stmt
        }

        Is there a faster method? I timed this with microtime() and 0.9 seconds
to retrieve the data and output the web page seems REAL slow. Now this is on a
100Base-T network, but I imagine it would be like watching paint dry over a
56K modem. Any thoughts or ideas on accelerating this? I did try ob_start()
and ob_end_flush() and it didn't help...

Thanks in advance for any help.

Robin 'Sparky' Kopetzky
Black Mesa Computers/Internet Service
Grants, NM 87020

-- PHP Database Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
