Hello--
A few times I've needed to load a very large text file and split it on
newlines. I didn't want to read the whole file (or fixed-size chunks of it)
into memory and cut off parts of strings mid-line, only to try to stitch
them back together with the next read. After some searching, I came up with
the following, which may help some of you out there:
ini_set('memory_limit', '1000M'); // Large file to process

$inputArray = array();
$totalcount = 0;

# Stream into PHP from STDIN; fgets() returns false at EOF
while (($selection = fgets(STDIN)) !== false) {
    if (trim($selection) == '') {
        continue; // skip blank lines
    }
    $selection = str_replace("'", '', $selection); // crude quote stripping
    $inputArray[] = $selection;
    if (count($inputArray) == 8000) {
        takeMyString($inputArray); // process the batch, insert into the DB
        $inputArray = array();
        $totalcount += 8000;
        sleep(2); // give the DB a breather between batches
    }
}
# Flush the final partial batch
if (count($inputArray) > 0) {
    takeMyString($inputArray);
    $totalcount += count($inputArray);
}
exit(0);
This reads each line from STDIN, appends it to an array, and when the array
hits 8000 entries (my memory limit at the time) it hands the array to a PHP
function that processes it and inserts it into the DB I'm using.
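I didn't include my takeMyString() since it's specific to my DB, but a
minimal sketch of what such a batch insert might look like follows. This is
just an illustration, not my actual function: the `lines` table, its `line`
column, and the extra PDO parameter are all made up here so the example is
self-contained (I use an in-memory SQLite DB for the demo).

```php
<?php
// Hypothetical batch-insert helper, sketched with PDO for illustration.
// The real takeMyString() takes only the array; a $db handle is added
// here so the example runs on its own.
function takeMyString(array $inputArray, PDO $db)
{
    $db->beginTransaction(); // one transaction per batch is much faster
    $stmt = $db->prepare('INSERT INTO lines (line) VALUES (?)');
    foreach ($inputArray as $line) {
        $stmt->execute(array(rtrim($line, "\n")));
    }
    $db->commit();
}

// Example usage with an in-memory SQLite database
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE lines (id INTEGER PRIMARY KEY, line TEXT)');
takeMyString(array("first\n", "second\n"), $db);
```

A prepared statement inside a single transaction per batch avoids both
per-row query parsing and per-row commits, which is where most of the time
goes on bulk inserts.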
Just in case anyone needs to process large files, stream them in:
# more file.log | php process.php
--
Take care,
William Attwood
Idea Extraordinaire
[email protected]
Samuel Goldwyn<http://www.brainyquote.com/quotes/authors/s/samuel_goldwyn.html>
- "I'm willing to admit that I may not always be right, but I am never
wrong."
/*
PLUG: http://plug.org, #utah on irc.freenode.net
Unsubscribe: http://plug.org/mailman/options/plug
Don't fear the penguin.
*/