Why not use an actual SQL query, which is optimised to be faster than string processing?

You would probably get more mileage from loading the CSV into a pre-formed table with an index in MySQL (this operation is very fast). I've loaded 5000 records from a CSV into MySQL in 5 milliseconds on a P500. Indexes might slow the import slightly, but not by much, and certainly not by 2 seconds unless you have a huge number of fields.
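The bulk load can be done with MySQL's LOAD DATA INFILE; a minimal sketch, assuming a comma-separated file and hypothetical table/column names (`csv_table`, `fieldname`, `updated`, `/path/to/data.csv`):

```sql
-- Pre-formed table, with an index on the field that will be searched
CREATE TABLE csv_table (
    id        INT NOT NULL,
    fieldname VARCHAR(64),
    updated   DATETIME,
    INDEX (fieldname)
);

-- Fast bulk import of the CSV file
LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE csv_table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
```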

Then use a query along the lines of

    SELECT DISTINCT fieldname FROM csv_table;

(with whatever table and column names you chose at import) to recover the unique values. You can also do ORDER BY and other stuff in this case, without jumping through hoops in your code ;-)
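The "most recent record for each value" part of the question can be pushed into SQL as well; a sketch, again assuming the hypothetical `csv_table` with a `fieldname` column and an `updated` timestamp:

```sql
-- Each unique value, with the timestamp of its most recent record
SELECT fieldname, MAX(updated) AS latest
FROM csv_table
GROUP BY fieldname
ORDER BY fieldname;
```

To pull back the full most-recent row for each value, join this result back to the table on (fieldname, updated).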

Cheers - Neil.

At 16:31 20/10/2003 +0000, you wrote:


I am developing an intranet application which parses a CSV file into a 2-D array, then finds all unique values of a field (with a PHP function) and the most recent record for each value (with foreach loops).

I have a working routine that takes about 15 seconds to perform this operation. Loading the CSV takes only 2 of those 15 seconds.

The remaining 13 seconds are taken up processing the 5000 records, which seems slow on a 2GHz P4 running Windows 2000 with lots of memory.

I was wondering if there is any way to use SQL to retrieve results from a 2-D array [n][fields]? Or any speed optimisation techniques that might be applicable?
David Nicholls, MCSE, CCEA

-- PHP Database Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php
