Re: [PHP] PHPExcel with large files (27,000+ rows)

2010-10-04 Thread Per Jessen
chris h wrote:

 I'm currently working on a project that requires parsing Excel
 files.  Basically the user uploads an Excel file, and then a script
 needs to save a row in a Postgres database for each row in the Excel
 file.  The issue we are having is that when we task PHPExcel with
 parsing an Excel file with, say, 27k rows, it explodes with a memory
 error.  I've read up on the PHPExcel forums and we've tried cell
 caching as well as ReadDataOnly, but they do not seem to be
 sufficient.

 Does anyone here know of a way to do this?  Surely there is a way to
 parse a large Excel file with PHP.

If your Excel file is XML or can be transformed to XML, I would just
use XSLT.  No PHP needed.
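
For what it's worth, PHP's bundled ext/xsl can also drive the transform
if you'd rather keep it in-process.  A minimal sketch, assuming the
workbook is available as SpreadsheetML; both file names below are
placeholders, and rows-to-csv.xsl is a stylesheet you'd write for your
row layout:

<?php
// Apply an XSLT stylesheet to a SpreadsheetML export (ext/xsl).
// 'rows-to-csv.xsl' and 'workbook.xml' are placeholder file names.
$xsl = new DOMDocument();
$xsl->load('rows-to-csv.xsl');

$xml = new DOMDocument();
$xml->load('workbook.xml');

$proc = new XSLTProcessor();
$proc->importStylesheet($xsl);

echo $proc->transformToXML($xml);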



-- 
Per Jessen, Zürich (19.1°C)


--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] PHPExcel with large files (27,000+ rows)

2010-10-04 Thread Marc Guay
I use this: http://code.google.com/p/php-csv-parser/

No idea if it's any better than your current solution.  I presume
you've tried extending PHP's memory limit?
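
If not, it can be raised per-script; the value below is just an
example:

<?php
// Raise the limit for this one script; 512M is an arbitrary example.
// The same setting can live in php.ini (memory_limit = 512M) instead.
ini_set('memory_limit', '512M');

Note that 27k rows may still blow through any fixed limit if the whole
workbook is held in memory at once, so this is a stopgap rather than a
fix.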




Re: [PHP] PHPExcel with large files (27,000+ rows)

2010-10-04 Thread shiplu
On Tue, Oct 5, 2010 at 12:39 AM, chris h chris...@gmail.com wrote:
 I'm currently working on a project that requires parsing Excel files.
 Basically the user uploads an Excel file, and then a script needs to
 save a row in a Postgres database for each row in the Excel file.  The
 issue we are having is that when we task PHPExcel with parsing an Excel
 file with, say, 27k rows, it explodes with a memory error.  I've read
 up on the PHPExcel forums and we've tried cell caching as well as
 ReadDataOnly, but they do not seem to be sufficient.

 Does anyone here know of a way to do this?  Surely there is a way to
 parse a large Excel file with PHP.  This is also NOT an on-demand
 service.  That is, when someone uploads a file they get a task_id that
 allows them to check the status of their Excel file.  So the solution
 does not need to be a fast one!


 Thanks,
 Chris.


1. Remove any variable that holds a big object if it's not necessary.
2. Use unset() where applicable.
3. Read the file chunk by chunk (see the sketch below).
4. Profile the script to find the exact place where you are wasting
memory.  Optimizing that small portion of code can improve memory use
considerably.
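
For point 3, PHPExcel's read filters let you load the sheet one slice
at a time.  A minimal sketch of that pattern -- the file name, chunk
size, and row count are placeholders, and the Postgres insert is left
as a comment:

<?php
// Read filter that only admits one chunk of rows per load() call.
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Always read row 1 (headings), plus the current chunk.
        return $row == 1
            || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$reader = PHPExcel_IOFactory::createReader('Excel5');   // .xls
$reader->setReadDataOnly(true);

$chunkFilter = new ChunkReadFilter();
$reader->setReadFilter($chunkFilter);

$chunkSize = 1000;
for ($startRow = 2; $startRow <= 27000; $startRow += $chunkSize) {
    $chunkFilter->setRows($startRow, $chunkSize);
    $workbook = $reader->load('upload.xls');   // placeholder name

    // ... walk the loaded rows and INSERT them into Postgres ...

    // Free this chunk before loading the next one (points 1 and 2).
    $workbook->disconnectWorksheets();
    unset($workbook);
}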

-- 
Shiplu Mokadd.im
My talks, http://talk.cmyweb.net
Follow me, http://twitter.com/shiplu
SUST Programmers, http://groups.google.com/group/p2psust
Innovation distinguishes bet ... ... (ask Steve Jobs the rest)




Re: [PHP] PHPExcel with large files (27,000+ rows)

2010-10-04 Thread chris h
Thanks Jessen/Marc, though the user-provided format can be xls, xlsx,
or csv, so I need a solution that supports all three.
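
Presumably the branch would be: stream csv line by line with fgetcsv()
and hand xls/xlsx to a chunked PHPExcel reader.  A rough sketch -- the
path, DSN, table, and column names are all placeholders:

<?php
// Route by extension: csv is streamed row by row, xls/xlsx go through
// PHPExcel.  $path, the DSN, and the INSERT are placeholders.
$path = '/tmp/upload.csv';
$ext  = strtolower(pathinfo($path, PATHINFO_EXTENSION));

if ($ext === 'csv') {
    $db   = new PDO('pgsql:host=localhost;dbname=app', 'user', 'pass');
    $stmt = $db->prepare('INSERT INTO rows (a, b, c) VALUES (?, ?, ?)');
    $fh   = fopen($path, 'r');
    while (($row = fgetcsv($fh)) !== false) {
        $stmt->execute($row);   // one row at a time keeps memory flat
    }
    fclose($fh);
} else {
    // Excel5 reads .xls, Excel2007 reads .xlsx.
    $reader = PHPExcel_IOFactory::createReader(
        $ext === 'xlsx' ? 'Excel2007' : 'Excel5'
    );
    $reader->setReadDataOnly(true);
    // ... chunked read as in shiplu's reply above ...
}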

Thanks for the ideas, shiplu.  I'll check with the team and see if
there's anything there we aren't already trying.


Chris.


On Mon, Oct 4, 2010 at 3:01 PM, shiplu shiplu@gmail.com wrote:

 On Tue, Oct 5, 2010 at 12:39 AM, chris h chris...@gmail.com wrote:
  I'm currently working on a project that requires parsing Excel files.
  Basically the user uploads an Excel file, and then a script needs to
  save a row in a Postgres database for each row in the Excel file.  The
  issue we are having is that when we task PHPExcel with parsing an
  Excel file with, say, 27k rows, it explodes with a memory error.  I've
  read up on the PHPExcel forums and we've tried cell caching as well as
  ReadDataOnly, but they do not seem to be sufficient.

  Does anyone here know of a way to do this?  Surely there is a way to
  parse a large Excel file with PHP.  This is also NOT an on-demand
  service.  That is, when someone uploads a file they get a task_id that
  allows them to check the status of their Excel file.  So the solution
  does not need to be a fast one!

  Thanks,
  Chris.

 1. Remove any variable that holds a big object if it's not necessary.
 2. Use unset() where applicable.
 3. Read the file chunk by chunk.
 4. Profile the script to find the exact place where you are wasting
 memory.  Optimizing that small portion of code can improve memory use
 considerably.

 --
 Shiplu Mokadd.im
 My talks, http://talk.cmyweb.net
 Follow me, http://twitter.com/shiplu
 SUST Programmers, http://groups.google.com/group/p2psust
 Innovation distinguishes bet ... ... (ask Steve Jobs the rest)