I've written an automated system for a similar problem; it does the following:

** every evening a cron job runs on a LAN Linux box and spools off prices into a CSV text file
** the file is uploaded via FTP to a remote web server, and a database entry is created to record the file
** a cron job on the remote server checks the database every half hour for new entries
** if a new entry exists, the file is imported into the database
** all uploaded files over a week old are deleted
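A minimal sketch of the half-hourly server-side job described above, assuming a hypothetical upload directory and a hypothetical `uploads` tracking table (both names are mine, not from the original setup):

```php
<?php
// Sketch of the remote half-hourly cron job described above.
// The uploads/ directory and the `uploads` table with
// (filename, imported) columns are illustrative assumptions.

// Delete uploaded files over a week old (the last step above).
function purgeOldUploads(string $dir, int $maxAgeSeconds = 7 * 24 * 3600): int
{
    $deleted = 0;
    foreach (glob($dir . '/*.csv') as $file) {
        if (time() - filemtime($file) > $maxAgeSeconds) {
            unlink($file);
            $deleted++;
        }
    }
    return $deleted;
}

// The check-and-import step would look roughly like:
//   $res = pg_query($conn,
//       "SELECT filename FROM uploads WHERE imported = false");
//   while ($row = pg_fetch_assoc($res)) {
//       // ... import the file, then mark the row imported ...
//   }
```

The crontab entry driving it would be something like `*/30 * * * * php /path/to/import.php`, adjusted to your paths.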

Hope it helps.


Dave wrote:
we have a number of csv dumps that occasionally have to be used to update tables
in a postgres database...

normally this is done by uploading the file, and whoever is running the update
uses a PHP-based parser that reads the file into an array, splits each line on
the comma, then executes an insert or update against the database with the
split values.

we have a new upload involving a 143 MB file, compared to previous upload
sizes of 10 KB to 150 KB.  Is there a more efficient way to handle this than
having PHP load the entire file into an array (which would have to be held in
memory for the duration of the operation, correct?).
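One way to avoid holding the whole file in memory is to stream it with `fgetcsv()`, one row at a time, so memory use stays roughly constant regardless of file size. This is a sketch, not your existing parser; the per-row callback is an assumption standing in for whatever insert/update logic you run:

```php
<?php
// Stream a large CSV row by row instead of loading it into one array.
function importCsv(string $path, callable $handleRow): int
{
    $fh = fopen($path, 'r');
    if ($fh === false) {
        throw new RuntimeException("cannot open $path");
    }
    $rows = 0;
    while (($fields = fgetcsv($fh)) !== false) {
        // $fields is already split on commas, with quoting handled.
        $handleRow($fields);  // e.g. pg_query_params(... INSERT/UPDATE ...)
        $rows++;
    }
    fclose($fh);
    return $rows;
}
```

For a straight bulk load into Postgres, it may also be worth skipping PHP entirely and using the server's `COPY` command (or `\copy` in psql), which is built for exactly this.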



-- PHP Database Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
