On Thu 25 Sep 2003 13:08, Dave [Hawk-Systems] wrote:
> we have a number of csv dumps that occasionally have to be used to update
> tables in a postgres database...
>
> normally this is done by uploading the file and having someone run a
> php-based parser which opens the file into an array, does a split on the
> comma, then executes an insert or update to the database with the split
> array values.
>
> we have a new upload that involves a 143mb file, compared to previous
> upload sizes of 10k to 150k. Is there a more efficient way to handle this
> rather than having PHP load the entire file into an array (which would have
> to be in memory during the operation, correct?).
I guess you are using file() to read the entire file into an array. Try
using fopen() instead and read one line at a time with fgets(), or better
yet fgetcsv(), which also does the comma splitting (and quote handling)
for you.
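A minimal sketch along these lines (the table name, column layout, and
connection string below are placeholders for whatever your real dump and
database use):

<?php
$db = pg_connect("dbname=mydb");
$fp = fopen("dump.csv", "r");
// fgetcsv() returns one parsed row per call, so only the current
// line is held in memory, no matter how large the file is.
while (($row = fgetcsv($fp, 4096)) !== false) {
    $col1 = pg_escape_string($row[0]);
    $col2 = pg_escape_string($row[1]);
    pg_query($db, "INSERT INTO mytable (col1, col2)
                   VALUES ('$col1', '$col2')");
}
fclose($fp);
?>

That way the 143mb file uses no more memory than the 10k ones did.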
--
Why use just any relational database,
when you can use PostgreSQL?
-----------------------------------------------------------------
Martín Marqués                  | [EMAIL PROTECTED]
Programmer, Administrator, DBA  | Centro de Telemática
                                | Universidad Nacional del Litoral
-----------------------------------------------------------------