----- Original Message -----
From: "Steve Clay" <[EMAIL PROTECTED]>
Sent: Thursday, March 21, 2002 4:54 PM
Subject: [PHP] eating mySQL result rows 1 by 1.. a better way?

> On my site I paginate query results by limiting rows output to a
> value, say LIMIT, and then the 2nd, 3rd pages run the same query with
> $skip=LIMIT, $skip=(LIMIT*2) value posted back.  I use the following
> code to "skip" these result rows, which is just fetching the next row
> to an unused array.
> //if there are rows to skip
> if ($result_rows > $rows_to_skip) {
>    while ( $rows_to_skip ) {
>       // eat a row
>       mysql_fetch_array($result);
>       $rows_to_skip--;
>       $total_results_shown++;
>    }
> }
> Can I make this more efficient?  Is there a way to eliminate this data
> before it leaves the mySQL server (and would it be faster)?

try LIMIT's two-argument (offset) form :)


The LIMIT clause can be used to constrain the number of rows returned by the
SELECT statement. LIMIT takes one or two numeric arguments. If two arguments
are given, the first specifies the offset of the first row to return and the
second specifies the maximum number of rows to return. The offset of the
initial row is 0 (not 1):

mysql> SELECT * FROM table LIMIT 5,10;  # Retrieve rows 6-15

If one argument is given, it indicates the maximum number of rows to return:

mysql> SELECT * FROM table LIMIT 5;     # Retrieve first 5 rows

In other words, LIMIT n is equivalent to LIMIT 0,n.
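So instead of fetching and discarding the skipped rows in PHP, put the skip into the query itself. A minimal sketch of building such a paged query (the `news` table, the `id` column, and the `paginate_sql` helper name are mine, for illustration only):

```php
<?php
// Build a paged SELECT.  $per_page is the page size; $page is 1-based,
// so page 1 starts at offset 0, page 2 at offset $per_page, and so on.
function paginate_sql($per_page, $page)
{
    $offset = ($page - 1) * $per_page;
    return "SELECT * FROM news ORDER BY id DESC LIMIT $offset,$per_page";
}

echo paginate_sql(10, 1), "\n";  // SELECT * FROM news ORDER BY id DESC LIMIT 0,10
echo paginate_sql(10, 3), "\n";  // SELECT * FROM news ORDER BY id DESC LIMIT 20,10
```

Pass the resulting string to mysql_query() and every row you fetch is one you actually display -- no eat-a-row loop, and the skipped rows never leave the MySQL server, which is also faster because they are never sent over the wire.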

Rod Kreisler wrote a nice article on building Next/Prev buttons for query results.

hope it helps :)

Joffrey van Wageningen

.-[ Joffrey van Wageningen | WoLFjuh | [EMAIL PROTECTED] ]--------------
| Networking Event 2000 - www.ne2000.nl - IRCnet:#ne2000, Undernet:#clue
| PGP:1024D/C6BA5863 - 3B93 52D3 CB91 9CB7 C50D FA79 865F 628A C6BA 5863
| * We demand guaranteed rigidly defined areas of doubt and uncertainty.
|                                                       -- Douglas Adams

PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
