
** Reply to note from "houston" <[EMAIL PROTECTED]> Wed, 28 Mar 2001 12:14:51 -0600

>
> I have to deal with displaying result sets that could potentially be quite
> large.  It would be bad form to drop 10,000 rows into a browser table, so I
> need a strategy for dealing with this possibility.
>
> It's obviously not a big deal to limit the output, but what is the typical
> strategy for handling <next> <previous> behavior?  It seems terribly
> inefficient to requery for a subset each time.  I suppose I could dump
> the entire result set to a text file keyed to the session, but that raises
> the problem of garbage collection with potentially large files.  Anyone have
> experience with this problem?

It may seem inefficient, but a fresh query for each page is the standard
way to handle it.  If you are using MySQL, look at the LIMIT clause:


   SELECT whatever FROM someTable LIMIT start, count

Start is the zero-based offset of the first row to return (row 0 is the first).
Count is the maximum number of rows to return.
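
Here is a rough sketch of how the <next> <previous> links can drive it
from PHP.  The table and column names are made up, and it assumes you
have already opened a connection with the mysql_* functions:

   <?php
   // How many rows per page, and which page was requested.
   $perPage = 25;
   $page = isset($_GET['page']) ? (int)$_GET['page'] : 0;
   if ($page < 0) $page = 0;

   // Zero-based offset of the first row on this page.
   $start = $page * $perPage;

   $result = mysql_query("SELECT id, name FROM products " .
                         "LIMIT $start, $perPage");

   while ($row = mysql_fetch_assoc($result)) {
       echo $row['id'] . ' ' . $row['name'] . "<br>\n";
   }

   // The links just re-request the script with a new page number;
   // each click runs a fresh query with a new offset.
   if ($page > 0) {
       echo '<a href="list.php?page=' . ($page - 1) . '">previous</a> ';
   }
   // A full page suggests there may be more rows after this one.
   if (mysql_num_rows($result) == $perPage) {
       echo '<a href="list.php?page=' . ($page + 1) . '">next</a>';
   }
   ?>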

I believe you will find this is not as inefficient as it seems: paging
with LIMIT is such a common thing to do that the people who wrote MySQL
have optimized for it.  I also believe you can count on the database to
do its own caching of recently used data, and to throw it away when it
is not hit often enough, which takes care of exactly the
garbage-collection problem you were worried about with session-keyed
files.
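
If you also want "page N of M" instead of blind next/previous links,
one cheap extra query gives you the total (same made-up table name as
above):

   <?php
   // Total row count, so you can compute the number of pages and
   // suppress the <next> link on the last one.
   $total = (int)mysql_result(mysql_query("SELECT COUNT(*) FROM products"), 0);
   $pages = (int)ceil($total / $perPage);
   ?>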




Rick Widmer
Internet Marketing Specialists
http://www.developersdesk.com
