If you are using PHP and a database, you can give the script more memory
and optimize the database. I only use PostgreSQL for my large data sets,
so I can tell you how to optimize Postgres...
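On the memory side, a minimal sketch of what "add more memory to the script" looks like in PHP (the 128M value is an assumption; tune it to your data size, and note both settings can also go in php.ini):

```php
<?php
// Raise the per-script memory limit; the default is often too small
// for holding two large result sets in arrays at once.
// '128M' is an illustrative value, not a recommendation.
ini_set('memory_limit', '128M');

// Long calculation runs also tend to hit the 30-second execution
// cap, so lift it for this script only.
set_time_limit(0);
?>
```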
From: Aron Pilhofer [mailto:[EMAIL PROTECTED]]
Sent: Monday, March 04, 2002 9:02 AM
To: [EMAIL PROTECTED]
Subject: [PHP-DB] optimization (another tack)
Let me try this again more generally. I am trying to optimize a PHP
function that handles very large result sets, which are transferred to
arrays, and then does some extremely heavy lifting in terms of
calculations on those arrays. By design, it iterates through every
possible combination of two result sets and runs calculations on each
pair. As you can imagine, the numbers get quite large quite fast; sets
of 500 by 1,000 require half a million calculations.
So, short of rewriting this function in C, which I cannot do, are there
any suggestions for optimizing it? For example:
1) is there any advantage to caching an array as a local file?
2) the script pumps the results of the calculations into a new table;
would it be faster to dump them into a local file instead?
3) is there any advantage to executing the script as a CGI? (Does that
make sense? I'm not sure I know the correct jargon here...)
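On question 2: writing rows to a local file and bulk-loading them afterwards is usually much faster than issuing one INSERT per row. A sketch, assuming CSV output and PostgreSQL's COPY for the load (file name and row data are hypothetical):

```php
<?php
// $results stands in for the array of calculated rows.
$results = array(
    array(1, 2.5),
    array(2, 3.5),
);

// Stream the rows to a CSV file instead of running an INSERT each.
$file = '/tmp/results.csv';
$fp = fopen($file, 'w');
foreach ($results as $row) {
    fputcsv($fp, $row);   // one CSV line per result row
}
fclose($fp);

// The file can then be bulk-loaded in psql, e.g.:
//   COPY results_table FROM '/tmp/results.csv' WITH CSV;
?>
```

COPY bypasses per-statement parsing and network round-trips, which is where row-by-row INSERTs lose most of their time.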
Any other tips folks have for scripts that handle a lot of calculations
would be greatly appreciated.
Thanks in advance.
PHP Database Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php