On Apr 4, 4:51 pm, AD7six <[EMAIL PROTECTED]> wrote:
> On Apr 4, 4:08 am, Grant Cox <[EMAIL PROTECTED]> wrote:
>
> > We have some large reports for our application - often involving ~250K
> > rows from 8 different tables. The problem is, if your reporting needs
> > to do any kind of collating or comparison, as ours does, you really
> > can't avoid having all the data loaded somewhere. You may be able to
> > do much of it in your database, which should stop PHP from running
> > out of memory, but the memory still has to be used somewhere.
>
> If you can get the db to do everything, everyone's a winner.
> e.g. http://forums.mysql.com/read.php?79,11324,13062#msg-13062
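>
> A rough sketch of the idea (untested; the table and column names are
> invented, the MySQL user needs the FILE privilege, and the file is
> written on the db server, not the web server):
>
> SELECT id, name, amount
> INTO OUTFILE '/tmp/report.csv'
> FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
> LINES TERMINATED BY '\n'
> FROM orders;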
>
> If you can't get the db to do everything (because the logic isn't all
> in the db, because it's too complex to move there, etc.), you can
> generate your output in chunks (standard batch processing afaic) and
> dump it into a temporary table - then you can export as above and
> again: everyone's a winner :).
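>
> Something along these lines (untested sketch; mysqli, the table names
> and the per-row maths are placeholders for whatever your app does):
>
> <?php
> $db = new mysqli('localhost', 'user', 'pass', 'app');
> $db->query("CREATE TEMPORARY TABLE report_rows (id INT, total DECIMAL(10,2))");
> $batch = 1000;
> for ($offset = 0; ; $offset += $batch) {
>     $res = $db->query("SELECT id, amount FROM orders LIMIT $offset, $batch");
>     if ($res->num_rows == 0) { $res->free(); break; }
>     $values = array();
>     while ($row = $res->fetch_assoc()) {
>         // the collating/comparison logic that can't live in the db
>         $values[] = sprintf('(%d, %.2f)', $row['id'], $row['amount'] * 1.1);
>     }
>     $res->free();
>     $db->query("INSERT INTO report_rows VALUES " . implode(',', $values));
> }
> // then export report_rows with SELECT ... INTO OUTFILE as above
> ?>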
>
> Then you only need to send the file to the user, or redirect the user
> to the right location. For a "standard" CSV dump you can reduce your
> PHP processing time and memory requirements to practically zero this
> way.
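>
> e.g. something like (untested; assumes the db wrote /tmp/report.csv
> as above and that the web server can read it):
>
> <?php
> $path = '/tmp/report.csv';
> header('Content-Type: text/csv');
> header('Content-Disposition: attachment; filename="report.csv"');
> header('Content-Length: ' . filesize($path));
> readfile($path); // streams the file to the client; it is never
>                  // loaded into PHP memory in one go
> ?>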
But that's going to increase the disk space requirements :-( I've
considered all the known options, but record-by-record dumping is the
only workable approach as far as I can see.
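
For what it's worth, a stripped-down sketch of what I mean by
record-by-record dumping (untested; the connection details and the
query are placeholders - MYSQLI_USE_RESULT keeps the result set on
the MySQL side, so PHP only holds one row at a time and nothing
extra is written to disk):

<?php
$db = new mysqli('localhost', 'user', 'pass', 'app');
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="report.csv"');
$out = fopen('php://output', 'w');
$res = $db->query("SELECT id, name, amount FROM orders", MYSQLI_USE_RESULT);
while ($row = $res->fetch_assoc()) {
    fputcsv($out, $row); // write each row straight to the response
}
$res->free();
fclose($out);
?>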
--
<?php echo 'Just another PHP saint'; ?>
Email: rrjanbiah-at-Y!com Blog: http://rajeshanbiah.blogspot.com/