Re: [PHP-DB] managing large result sets

2001-03-29 Thread Manuel Lemos

Hello,

houston wrote:
> 
> I have to deal with displaying result sets that could potentially be quite
> large.  It would be bad form to drop 10,000 rows into a browser table so I
> need a strategy for dealing with this possibility.
> 
> It's obviously not a big deal to limit the output but what is the typical
> strategy for handling "next page" behavior?  It seems terribly
> inefficient to requery for a subset each time.  I suppose that I could dump
> the entire result set to a text file keyed to the session but that raises
> the problem of garbage collection with potentially large files.  Anyone have
> experience with this problem?

It is an everyday problem with Web programming. Regardless of the
database you are using, you may want to try Metabase, a PHP database
abstraction package that lets you specify a range of rows to retrieve
from the server. All you need to do is call
MetabaseSetSelectedRowRange($database,$first,$limit) before you execute
a query. It will then return result rows starting at row $first, up to
$limit rows. It works like MySQL's LIMIT clause, except that it works
with all supported databases (Oracle, MS SQL, Informix, Interbase,
PostgreSQL, etc.).

http://phpclasses.UpperDesign.com/browse.html/package/20
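
Roughly, a page script could look like the sketch below. This is just
an illustration: apart from MetabaseSetSelectedRowRange(), the
function names used here (MetabaseQuery, MetabaseNumberOfRows,
MetabaseFetchResult) and the query are placeholders that should be
checked against the Metabase documentation.

   <?php
   /* Sketch only: assumes a Metabase $database handle has already
      been set up; verify the exact function signatures against the
      Metabase documentation. */
   $page  = isset($HTTP_GET_VARS["page"])
            ? (int)$HTTP_GET_VARS["page"] : 0;
   $limit = 20;                  /* rows per page           */
   $first = $page * $limit;      /* first row of this page  */

   /* Restrict the next query to the requested window of rows. */
   MetabaseSetSelectedRowRange($database, $first, $limit);

   $result = MetabaseQuery($database,
       "SELECT id, name FROM customers ORDER BY name");
   $rows = MetabaseNumberOfRows($database, $result);
   for ($row = 0; $row < $rows; $row++) {
       echo MetabaseFetchResult($database, $result, $row, "name"),
            "<br>\n";
   }
   ?>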

There is also a PHP class that takes advantage of this capability of
Metabase to display query results in HTML tables split across multiple
pages, with navigation links that the class automatically generates
for you near the HTML tables.

http://phpclasses.UpperDesign.com/browse.html/package/130


Manuel Lemos

-- 
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




Re: [PHP-DB] managing large result sets

2001-03-28 Thread php3

Addressed to: "houston" <[EMAIL PROTECTED]>
  [EMAIL PROTECTED]

** Reply to note from "houston" <[EMAIL PROTECTED]> Wed, 28 Mar 2001 
12:14:51 -0600

>
> I have to deal with displaying result sets that could potentially be quite
> large.  It would be bad form to drop 10,000 rows into a browser table so I
> need a strategy for dealing with this possibility.
>
> It's obviously not a big deal to limit the output but what is the typical
> strategy for handling "next page" behavior?  It seems terribly
> inefficient to requery for a subset each time.  I suppose that I could dump
> the entire result set to a text file keyed to the session but that raises
> the problem of garbage collection with potentially large files.  Anyone have
> experience with this problem?

It may seem inefficient, but a new query each time is the way to
handle it.  If you are using MySQL, look at the LIMIT clause.


   SELECT whatever FROM someTable LIMIT start, count

Start is the offset of the first row to return (the first row is
offset 0).  Count is the maximum number of rows to return.
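
For example, a quick paging sketch (assumes an open mysql_connect()
link; the table, columns, and list.php URL are placeholders):

   <?php
   /* Sketch: pages through someTable $count rows at a time.
      Assumes mysql_connect() has already opened a link. */
   $count = 25;
   $start = isset($HTTP_GET_VARS["start"])
            ? (int)$HTTP_GET_VARS["start"] : 0;

   $result = mysql_query("SELECT id, name FROM someTable " .
                         "ORDER BY id LIMIT $start, $count");
   while ($row = mysql_fetch_array($result)) {
       echo $row["name"], "<br>\n";
   }

   /* Link to the next page by passing the new offset back in. */
   $next = $start + $count;
   echo "<a href=\"list.php?start=$next\">Next</a>\n";
   ?>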

I believe you will find this is not as inefficient as it may seem: it
is a common thing to do, and the people who wrote MySQL know it.  I
believe you can count on the database to cache the result set for you,
and to handle garbage collection if you don't hit it often enough.




Rick Widmer
Internet Marketing Specialists
http://www.developersdesk.com
