I'm working on a web application, and one of the things I am doing is creating an archiving function that moves older data to archive tables in order to minimise the amount of data in the active tables. This is so that the data that is used more frequently can be accessed faster by the users.

My approach in building the archive function is (a rough sketch follows the list):

1) run a SELECT query on the data to be archived
2) use mysql_fetch_array to put the data into an array
3) run INSERT queries to put the data into the archive tables.
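
For illustration, here is a minimal sketch of that approach using the mysql_* functions mentioned above. The table and column names (orders, orders_archive, created_at) and the cut-off date are placeholders invented for the example, not part of my actual schema:

<?php
// Step 1: select the rows that should be archived.
$result = mysql_query(
    "SELECT id, customer, total, created_at
       FROM orders
      WHERE created_at < '2008-01-01'"
);

// Step 2: accumulate every row into one array
// (this is the array I am worried could get big).
$rows = array();
while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
    $rows[] = $row;
}

// Step 3: insert each row into the archive table.
foreach ($rows as $row) {
    mysql_query(sprintf(
        "INSERT INTO orders_archive (id, customer, total, created_at)
         VALUES (%d, '%s', %.2f, '%s')",
        (int) $row['id'],
        mysql_real_escape_string($row['customer']),
        (float) $row['total'],
        mysql_real_escape_string($row['created_at'])
    ));
}
?>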

My concern is that in some cases hundreds of rows of data would need to be moved, which could lead to awfully big arrays. However, the archiving function is likely to be used infrequently - not more than once or twice per week.

This leads to two questions:

1) Could such a big array cause performance problems or worse?
2) Is there a better way?

Many thanks,

Jeff
