We have a number of CSV dumps that occasionally have to be used to update tables
in a Postgres database.

Normally this is done by uploading the file and having someone run a PHP-based
parser, which reads the file into an array via file(), does a split() on the
comma, then executes an insert or update against the database with the split
array values.
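
In rough outline it looks like this (connection string, table, and column names
are made up for illustration):

<?php
// Rough sketch of the current approach: file() reads every line of the
// dump into an array, so the entire file sits in memory at once.
// 'mydb', 'my_table', and the column names are placeholders.
$conn = pg_connect('dbname=mydb');

$lines = file('/path/to/dump.csv');        // whole file loaded into memory
foreach ($lines as $line) {
    $fields = explode(',', rtrim($line));  // split on the comma
    pg_query_params(
        $conn,
        'INSERT INTO my_table (col_a, col_b, col_c) VALUES ($1, $2, $3)',
        $fields
    );
}
?>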

We have a new upload that involves a 143MB file, compared to previous upload
sizes of 10KB to 150KB.  Is there a more efficient way to handle this than
having PHP load the entire file into an array (which would have to stay in
memory for the whole operation, correct?)?  Perhaps fopen() and reading line by
line, as sketched below, or would that be the same memory load?
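
Something like this is what I had in mind (same made-up table and column names),
using fgetcsv() so only one row is held in memory at a time:

<?php
// Minimal sketch of the line-by-line idea: fopen() plus fgetcsv() pulls
// one row at a time, so the 143MB file is never loaded whole.
// Connection string, table, and column names are placeholders.
$conn = pg_connect('dbname=mydb');

$fh = fopen('/path/to/dump.csv', 'r');
if ($fh === false) {
    die('could not open file');
}
while (($fields = fgetcsv($fh, 0, ',')) !== false) {
    if ($fields === array(null)) {
        continue;  // skip blank lines
    }
    pg_query_params(
        $conn,
        'INSERT INTO my_table (col_a, col_b, col_c) VALUES ($1, $2, $3)',
        $fields
    );
}
fclose($fh);
?>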

Thanks

Dave
