houston wrote:
> I have to deal with displaying result sets that could potentially be quite
> large.  It would be bad form to drop 10,000 rows into a browser table so I
> need a strategy for dealing with this possibility.
> It's obviously not a big deal to limit the output but what is the typical
> strategy for handling <next> <previous> behavior?  It seems terribly
> inefficient to requery for a subset each time.  I suppose that I could dump
> the entire result set to a text file keyed to the session but that raises
> the problem of garbage collection with potentially large files.  Anyone have
> experience with this problem?

It is an everyday problem in Web programming. Regardless of the
database you are using, you may want to try Metabase, a PHP
database abstraction package that lets you specify a range of rows to
retrieve from the server. All you need to do is call
MetabaseSetSelectedRowRange($database, $first, $limit) before you
execute a query. It will return result rows starting at row $first,
up to $limit rows. It works like MySQL's LIMIT clause, except that it
works with all supported databases (Oracle, MS SQL, Informix,
Interbase, PostgreSQL, etc.).
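
To make the idea concrete, here is a minimal sketch of how a paged
query might look. Only MetabaseSetSelectedRowRange is taken from the
description above; the page-offset helper, the variable names, and the
commented-out query call are hypothetical illustration, not Metabase's
documented API:

```php
<?php
// Compute the zero-based first row for a given 1-based page number.
// This arithmetic is all a <next>/<previous> link needs to carry:
// just the page number, never the whole result set.
function page_first_row($page, $rows_per_page)
{
    return ($page - 1) * $rows_per_page;
}

$rows_per_page = 50;
$page = 3; // e.g. taken from $_GET["page"] after validation

$first = page_first_row($page, $rows_per_page);

// With an open Metabase connection in $database, the call described
// above would restrict the next query to that window (sketch only):
//
//   MetabaseSetSelectedRowRange($database, $first, $rows_per_page);
//   $result = MetabaseQuery($database,
//       "SELECT * FROM orders ORDER BY id");

echo "Fetching rows $first to " . ($first + $rows_per_page - 1) . "\n";
```

Since only the row window travels in the URL, there is nothing to
garbage-collect between requests; the database re-serves the slice.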


There is also a PHP class that takes advantage of this capability of
Metabase to display query results in HTML tables split across multiple
pages, with navigation links that the class generates automatically
near the tables.


Manuel Lemos

PHP Database Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]