Okay... glad to see someone put some thought into it instead of just wanting
to do it because "queries are bad!". Also, the speed of the query doesn't
depend on the client's connection speed at all; the query runs entirely on
the server.

So, to solve your problem, load it into a session array.

session_start();
$result = mysql_query("...");
while($row = mysql_fetch_row($result))
{
  $_SESSION['result']['column0'][] = $row[0];
  $_SESSION['result']['column1'][] = $row[1];
  // ...and so on for each column you need
}

Then set a $Page variable and use it (times 30) as the starting index into
the session array.

$Start = $Page * 30;
$End = $Start + 30;
// the isset() check stops cleanly on a short last page
for($x = $Start; $x < $End && isset($_SESSION['result']['column0'][$x]); $x++)
{
  echo $_SESSION['result']['column0'][$x];
  echo $_SESSION['result']['column1'][$x];
  // ...and so on for each column
}

Adapt to your needs.
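
If you need the Previous/Next links themselves, something like this should
do it (just a sketch; the "page" URL parameter name is my own example):

$Page = isset($_GET['page']) ? (int)$_GET['page'] : 0; // default to page 0

// ...display rows $Page*30 through $Page*30+29 as above...

if($Page > 0)
{
  echo '<a href="?page=' . ($Page - 1) . '">Previous</a> ';
}
if(($Page + 1) * 30 < count($_SESSION['result']['column0']))
{
  echo '<a href="?page=' . ($Page + 1) . '">Next</a>';
}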

Or maybe you could load the results into a temporary table and do the
subsequent paging out of that table (which will be faster, since it only has
3-5K rows). Run a cron job to delete the temp tables after they are X minutes
old, etc...
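
Something along these lines, assuming MySQL and an already-open connection
(the table and column names here are only examples). Note that an actual
CREATE TEMPORARY TABLE vanishes when the connection closes at the end of the
request, so for paging across page loads you'd use a regular table, which is
why the cron cleanup is needed:

// one scratch table per visitor, named after the session id
$tmp = 'results_' . session_id();
mysql_query("CREATE TABLE $tmp SELECT column0, column1
             FROM big_table WHERE ..."); // your original conditions

// each page load then selects out of the small scratch table instead
$Start = $Page * 30;
$result = mysql_query("SELECT column0, column1 FROM $tmp LIMIT $Start,30");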

---John Holmes...

----- Original Message -----
From: "Jay Blanchard" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Monday, June 03, 2002 3:07 PM
Subject: RE: [PHP] Previous & Next Navigation


> [snip]
> So you think it's more efficient and faster to load a 3 - 5 thousand row
> table into an array in memory and pass that around to all of your scripts
> (through sessions?), rather than just passing a $page variable and doing a
> query to return 30 rows on each page??
>
> If you pass a $Page variable, you can make your query like this:
>
> SELECT * FROM table LIMIT $Page*30,30
>
> Just increment and decrement $Page as you traverse the results...easy, eh?
> [/snip]
>
> It's definitely faster; as for more efficient, I would have to do
> benchmarks. The original table consists of millions of rows, and each
> time you query with LIMIT the query traverses the entire set of records
> in the data to get the proper CONDITIONS. Given that there are 3k - 5k
> rows amongst the millions, this requires a lot of search time for each
> query. The memory footprint of the 3k - 5k records, even if the total
> memory needed for each record is 1k (which it is not), is 30k - 50k of
> RAM, less than the size of most web pages. The LIMIT query, running on a
> slow server to simulate dial-up connections, takes anywhere from 1.3 to
> 2.2 minutes (been timing it a lot today) to execute. Since efficiency is
> often lumped in with speed, I would have to surmise that using an array
> in this instance would be more efficient as well.
>
> Thanks!
>
> Jay

