On Tue, 2004-02-24 at 12:39, Pablo Gosse wrote:
> I've got a log file that's about 1.2 gig that I need to parse.
> 
> Can PHP handle this or am I better off breaking this down into 12 100mb
> chunks and processing it?

PHP has a very rich set of functions for reading files; the right choice
depends on what you are doing with the log file and how you read it in.
For example, the file[1] function reads the entire log file into memory,
which is probably a bad idea for a 1.2 gig file.  However, if you need
to access many different parts of the file frequently and you have the
RAM, go for it.  On the other side of the spectrum are the C-like
functions: fopen[2], fgets[3], fread[4], fclose[5], et al.  These give
you fine-grained control over how much of the file you read at a time.
Lastly, the stream[6] functions may be a good middle ground for you.
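
If it helps, here is a rough sketch of the line-by-line fgets approach
(untested; the log path and the per-line processing are just
placeholders you would replace with your own):

<?php
// NOTE: '/path/to/access.log' is a placeholder -- point it at your log.
$fp = fopen('/path/to/access.log', 'r');
if ($fp === false) {
    die("Could not open log file\n");
}

// Read one line per iteration so only a single line is ever held in
// memory, no matter how big the file is.  The 8192 caps how many bytes
// fgets will read in one call.
while (!feof($fp)) {
    $line = fgets($fp, 8192);
    if ($line === false) {
        break;
    }
    // Do your parsing here, e.g. explode() or preg_match() on $line.
}

fclose($fp);
?>

Memory use stays flat this way, so the 1.2 gig file itself should not be
a problem; the run time just scales with the size of the file, and you
avoid having to split it into chunks at all.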

Regards,
Adam

[1] http://www.php.net/file
[2] http://www.php.net/fopen
[3] http://www.php.net/fgets
[4] http://www.php.net/fread
[5] http://www.php.net/fclose
[6] http://www.php.net/stream

-- 
Adam Bregenzer
[EMAIL PROTECTED]
http://adam.bregenzer.net/
