Re: [PHP] Previous Next Navigation

2002-06-03 Thread 1LT John W. Holmes

So you think it's more efficient and faster to load a 3 - 5 thousand row
table into an array in memory and pass that around to all of your scripts
(through sessions?), rather than just passing a $page variable and doing a
query to return 30 rows on each page??

If you pass a $Page variable, you can make your query like this:

SELECT * FROM table LIMIT $Page*30,30

Just increment and decrement $Page as you traverse the results...easy, eh?
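[Editor's sketch: a minimal PHP rendering of the $Page idea above. The function and table names are illustrative, not from the original post. Note that inside a double-quoted PHP string only $Page interpolates, not the *30 arithmetic, so the offset is best computed first.]

```php
<?php
// Sketch of $Page-based paging; names are illustrative assumptions.
function page_query($page, $per_page = 30)
{
    $page   = max(0, (int)$page);   // guard against bad input
    $offset = $page * $per_page;    // compute in PHP first: writing
                                    // "LIMIT $Page*30,30" would leave a
                                    // literal *30 in the SQL string
    return "SELECT * FROM table LIMIT $offset, $per_page";
}

echo page_query(2);   // SELECT * FROM table LIMIT 60, 30
```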

---John Holmes...

- Original Message -
From: Jay Blanchard [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Monday, June 03, 2002 1:06 PM
Subject: [PHP] Previous & Next Navigation


 Howdy!

 I am working on a project where approximately 3k - 5k records are returned
 and need to be displayed 30 per page with 'previous' and 'next' navigation
 at the request of the users. Does anyone know of an efficient script that
 will do this in PHP (with MySQL) that will not query the database every
 time? I would like to place the records into an array after one query to the
 database and then navigate the array...both for efficiency and speed.

 Anyhow, if anyone knows, please let me know. Otherwise I will have to write
 what I am looking for, and I really have no desire to re-invent the wheel on
 a Monday afternoon.

 TIA!

 Jay



 --
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php







RE: [PHP] Previous Next Navigation

2002-06-03 Thread Jay Blanchard

[snip]
So you think it's more efficient and faster to load a 3 - 5 thousand row
table into an array in memory and pass that around to all of your scripts
(through sessions?), rather than just passing a $page variable and doing a
query to return 30 rows on each page??

If you pass a $Page variable, you can make your query like this:

SELECT * FROM table LIMIT $Page*30,30

Just increment and decrement $Page as you traverse the results...easy, eh?
[/snip]

It's definitely faster; as for more efficient, I would have to do benchmarks.
The original table consists of millions of rows, and each time you query with
LIMIT the query traverses the entire set of records in the data to find the
rows matching the conditions. Given that there are 3k - 5k rows amongst the
millions, this requires a lot of search time for each query. The memory
footprint of the 3k - 5k records, even if the total memory needed for each
record is 1k (which it is not), is 30k - 50k RAM, less than the size of most
web pages. The LIMIT query, running on a slow server to simulate dial-up
connections, takes anywhere from 1.3 to 2.2 minutes (been timing it a lot
today) to execute. Since efficiency is often lumped in with speed, I would
have to surmise that using an array in this instance would be more efficient
as well.

Thanks!

Jay










Re: [PHP] Previous Next Navigation

2002-06-03 Thread 1LT John W. Holmes

Okay...glad to see someone put some thought into it instead of just wanting
to do it "because queries are bad!". Also, the speed of the query doesn't
depend on the connection speed at all.

So, to solve your problem, load it into a session array.

session_start();
$result = mysql_query(...);
while($row = mysql_fetch_row($result))
{
  $_SESSION['result']['column0'][] = $row[0];
  $_SESSION['result']['column1'][] = $row[1];
  // etc...
}

Then, set a $Page variable and use that (times 30), as the index for the
session array.

$Start = $Page * 30;
$End = $Start + 30;
for($x = $Start; $x < $End; $x++)
{
  echo $_SESSION['result']['column0'][$x];
  echo $_SESSION['result']['column1'][$x];
  // etc...
}

Adapt to your needs.

Or maybe you could load the results into a temporary table and do the
subsequent paging out of that table (which will be faster, only having 3-5K
rows). Run a cron job to delete the temp tables after they are X minutes old,
etc...
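[Editor's sketch of that scratch-table idea; every table and column name below is invented for illustration. One caveat worth noting: a true MySQL TEMPORARY table vanishes when the connection closes, so paging across separate page requests needs a regular scratch table, which is exactly why the cron cleanup above is needed.]

```php
<?php
// Hypothetical scratch-table helpers; all names are invented.
// A per-session suffix keeps users' scratch tables from colliding;
// a cron job would drop stale ones later.
function scratch_table($session_id)
{
    return 'scratch_' . substr(md5($session_id), 0, 8);
}

function create_scratch_sql($session_id)
{
    $table = scratch_table($session_id);
    return "CREATE TABLE $table SELECT * FROM big_table WHERE some_condition";
}

function scratch_page_sql($session_id, $page, $per_page = 30)
{
    $table  = scratch_table($session_id);
    $offset = max(0, (int)$page) * $per_page;
    return "SELECT * FROM $table LIMIT $offset, $per_page";
}
```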

---John Holmes...

- Original Message -
From: Jay Blanchard [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Monday, June 03, 2002 3:07 PM
Subject: RE: [PHP] Previous & Next Navigation


 [snip]
 So you think it's more efficient and faster to load a 3 - 5 thousand row
 table into an array in memory and pass that around to all of your scripts
 (through sessions?), rather than just passing a $page variable and doing a
 query to return 30 rows on each page??

 If you pass a $Page variable, you can make your query like this:

 SELECT * FROM table LIMIT $Page*30,30

 Just increment and decrement $Page as you traverse the results...easy, eh?
 [/snip]

 It's definitely faster; as for more efficient, I would have to do benchmarks.
 The original table consists of millions of rows, and each time you query with
 LIMIT the query traverses the entire set of records in the data to find the
 rows matching the conditions. Given that there are 3k - 5k rows amongst the
 millions, this requires a lot of search time for each query. The memory
 footprint of the 3k - 5k records, even if the total memory needed for each
 record is 1k (which it is not), is 30k - 50k RAM, less than the size of most
 web pages. The LIMIT query, running on a slow server to simulate dial-up
 connections, takes anywhere from 1.3 to 2.2 minutes (been timing it a lot
 today) to execute. Since efficiency is often lumped in with speed, I would
 have to surmise that using an array in this instance would be more efficient
 as well.

 Thanks!

 Jay










RE: [PHP] Previous Next Navigation

2002-06-03 Thread Jason Soza

I don't know if it's possible (or faster, or more efficient), but could 
you query your millions of records for those 3k - 5k, insert them into 
a temp table, then do your LIMIT queries on that table for your 
prev/next pages?

Just an idea!

Jason Soza
Juneau, Alaska

- Original Message -
From: Jay Blanchard [EMAIL PROTECTED]
Date: Monday, June 3, 2002 11:07 am
Subject: RE: [PHP] Previous & Next Navigation

 [snip]
 So you think it's more efficient and faster to load a 3 - 5 thousand row
 table into an array in memory and pass that around to all of your scripts
 (through sessions?), rather than just passing a $page variable and doing a
 query to return 30 rows on each page??

 If you pass a $Page variable, you can make your query like this:

 SELECT * FROM table LIMIT $Page*30,30

 Just increment and decrement $Page as you traverse the results...easy, eh?
 [/snip]

 It's definitely faster; as for more efficient, I would have to do benchmarks.
 The original table consists of millions of rows, and each time you query with
 LIMIT the query traverses the entire set of records in the data to find the
 rows matching the conditions. Given that there are 3k - 5k rows amongst the
 millions, this requires a lot of search time for each query. The memory
 footprint of the 3k - 5k records, even if the total memory needed for each
 record is 1k (which it is not), is 30k - 50k RAM, less than the size of most
 web pages. The LIMIT query, running on a slow server to simulate dial-up
 connections, takes anywhere from 1.3 to 2.2 minutes (been timing it a lot
 today) to execute. Since efficiency is often lumped in with speed, I would
 have to surmise that using an array in this instance would be more efficient
 as well.
 
 Thanks!
 
 Jay
 
 
 
 
 
 




RE: [PHP] Previous Next Navigation

2002-06-03 Thread Jay Blanchard

[snip]
Okay...glad to see someone put some thought into it instead of just wanting
to do it "because queries are bad!". Also, the speed of the query doesn't
depend on the connection speed at all.
[/snip]

I know...in this case it is just the number of records that are needed vs.
the number of records in the table.

[snip]
Or maybe you could load the results into a temporary table and do the
subsequent paging out of that table (which will be faster, only having 3-5K
rows). Run a cron job to delete the temp tables after they are X minutes old,
etc...
[/snip]

This is where I am leaning right now, the LIMIT query would be much faster
than an array if this is the case.

Thanks John!

Jay







RE: [PHP] Previous Next Navigation

2002-06-03 Thread David Freeman


  The memory footprint of the 3k - 5k of records, even if the 
  total memory needed for each record is 1k (which it is not), 
  is 30k - 50k RAM, less than the size of most web pages. The 
  LIMIT query, running on a slow server to simulate dial-up 
  connections, takes anywhere from 1.3 to 2.2 minutes (been 
  timing it a lot today) to execute.

Ummm, maybe I've missed something here, but I don't think a 'slow server'
is an effective simulation of a dial-up connection.  If your queries are
getting bogged down, then a faster server is going to be an effective
solution.  A slow dial-up just slows the rate at which the end user can
receive the information but has no real effect on the speed at which the
server physically generates it (i.e. back-end processing).

If your query is as you describe, then perhaps a more appropriate
approach would be to use a scratch table in your database: after
having queried your large database, write the results back as a sub-set
in a different table, then just carry a pointer variable from page to
page within your site to query that scratch table. It should be a very
fast query, as the results will be precisely what you need and ordered
ready to go.

CYA, Dave







Re: [PHP] Previous Next Navigation

2002-06-03 Thread Justin French

Only querying the database once would mean setting a massive session,
cookie, or something similar to hold the whole result.

Doesn't sound right to me... for starters, 3000 records / 30 per page = 100
pages... it seems unlikely that every user will make it through the 100
pages, so they (or the server) would be carrying around useless data.

What you want to do is limit each query to 30 results on each page, using
MySQL's LIMIT.

SELECT * FROM table WHERE something ORDER BY id LIMIT 0,30 (first page)
SELECT * FROM table WHERE something ORDER BY id LIMIT 30,30 (2nd page)
etc.


Justin French



on 04/06/02 3:06 AM, Jay Blanchard ([EMAIL PROTECTED])
wrote:

 Howdy!
 
 I am working on a project where approximately 3k - 5k records are returned
 and need to be displayed 30 per page with 'previous' and 'next' navigation
 at the request of the users. Does anyone know of an efficient script that
 will do this in PHP (with MySQL) that will not query the database every
 time? I would like to place the records into an array after one query to the
 database and then navigate the array...both for efficiency and speed.
 
 Anyhow, if anyone knows, please let me know. Otherwise I will have to write
 what I am looking for, and I really have no desire to re-invent the wheel on
 a Monday afternoon.
 
 TIA!
 
 Jay
 
 






RE: [PHP] Previous Next Navigation

2002-06-03 Thread Martin Towell

If you don't want to query the database again, you could always write the
results to a file.
There's a matter of cleaning up old cache files though, but that's not what
was asked... ;D
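[Editor's sketch of the file-cache idea; paths and function names are assumptions, not from the original post. serialize()/unserialize() round-trip the result set, and stale files would still need the cleanup mentioned above, e.g. a cron job or a filemtime() check when reading.]

```php
<?php
// Hypothetical file-based result cache; every name here is invented.
function cache_results($key, $rows, $dir = '/tmp')
{
    $file = $dir . '/results_' . md5($key) . '.cache';
    $fp = fopen($file, 'w');
    fwrite($fp, serialize($rows));   // dump the whole result set once
    fclose($fp);
    return $file;
}

function read_page($file, $page, $per_page = 30)
{
    $rows = unserialize(implode('', file($file)));
    // Slice out just the 30 rows the current page needs.
    return array_slice($rows, max(0, (int)$page) * $per_page, $per_page);
}
```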

-Original Message-
From: Justin French [mailto:[EMAIL PROTECTED]]
Sent: Tuesday, June 04, 2002 11:40 AM
To: Jay Blanchard; [EMAIL PROTECTED]
Subject: Re: [PHP] Previous & Next Navigation


Only querying the database once would mean setting a massive session,
cookie, or something similar to hold the whole result.

Doesn't sound right to me... for starters, 3000 records / 30 per page = 100
pages... it seems unlikely that every user will make it through the 100
pages, so they (or the server) would be carrying around useless data.

What you want to do is limit each query to 30 results on each page, using
MySQL's LIMIT.

SELECT * FROM table WHERE something ORDER BY id LIMIT 0,30 (first page)
SELECT * FROM table WHERE something ORDER BY id LIMIT 30,30 (2nd page)
etc.


Justin French



on 04/06/02 3:06 AM, Jay Blanchard ([EMAIL PROTECTED])
wrote:

 Howdy!
 
 I am working on a project where approximately 3k - 5k records are returned
 and need to be displayed 30 per page with 'previous' and 'next' navigation
 at the request of the users. Does anyone know of an efficient script that
 will do this in PHP (with MySQL) that will not query the database every
 time? I would like to place the records into an array after one query to the
 database and then navigate the array...both for efficiency and speed.
 
 Anyhow, if anyone knows, please let me know. Otherwise I will have to write
 what I am looking for, and I really have no desire to re-invent the wheel on
 a Monday afternoon.
 
 TIA!
 
 Jay
 
 

