[PHP] Reading large files via PHP?
I'm reading a 66MB file into PHP so I can parse it and load it into a database. However, it is taking so long for PHP to read the file that the script times out. I know about ini_set("max_execution_time", 120) (or whatever timeout I want to set), but my question is this: is it possible to read in a file and at the same time echo out a status of how far along the read is? The code I have right now is below. Just wondering if what I'm trying to do is possible and how to do it.

    // set script timeout
    ini_set("max_execution_time", 120);

    // import file name
    $log_file_name = "access_log.1";

    echo "Reading access log!<br/>";
    if (!$ac_arr = file("./$log_file_name")) {
        echo "Couldn't load access log!";
        die;
    }

Thanks!
Re: [PHP] Reading large files via PHP?
> Is it possible to read in a file and at the same time echo out a status
> of how far along the file has been read?

Even if you get around the execution time, file() is going to use up 66 MB of memory to store that file in $ac_arr. You'd be much better off doing something like:

    $fd = fopen("./$log_file_name", "r");
    while ( !feof($fd) ) {
        $buf = fread($fd, 32768);
        // process buffer here
    }
    fclose($fd);

That will let you do your progress thing as well as not eat up so much memory. Play around with the value 32768 to see if smaller or larger makes any difference on your system. Good luck!

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
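A minimal sketch of the chunked read with a progress indicator, to illustrate the suggestion above. The file name "access_log.1" comes from the thread; the sample file contents and the demo setup/cleanup lines are invented here so the example is self-contained. The progress math just compares bytes read so far against filesize().

```php
<?php
// Hypothetical demo file; in real use, point this at your actual log.
$log_file_name = "access_log.1";
file_put_contents($log_file_name, str_repeat("127.0.0.1 - GET /index.html\n", 1000));

$total = filesize($log_file_name);   // total bytes, for the progress math
$fd    = fopen("./$log_file_name", "r");
$read  = 0;

while (!feof($fd)) {
    $buf   = fread($fd, 32768);      // read one 32 KB chunk
    $read += strlen($buf);
    // process buffer here...

    // report how far along we are (flush so the browser sees it right away)
    printf("Read %d%%<br/>\n", (int)(100 * $read / $total));
    flush();
}
fclose($fd);
unlink($log_file_name);              // clean up the demo file
```

Echoing and flushing inside the loop is what makes the status visible while the script is still running, instead of all at once at the end.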
RE: [PHP] Reading large files via PHP?
> $fd = fopen("./$log_file_name", "r");
> while ( !feof($fd) ) {
>     $buf = fread($fd, 32768);
>     // process buffer here
> }
> fclose($fd);

Ahhh, okay, that's very cool; I did not know that. I knew that reading it all at once was chewing up a lot of memory. I guess the only thing I have to worry about now is whether a read like that will truncate a line instead of returning whole lines. The reason I ask is that, if possible, I would rather have it read line by line for X lines, process that buffer, then go and grab more lines until end of file. Is that possible?
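To the truncation worry: fread() does split lines at arbitrary byte boundaries. One common workaround (a sketch, not something from the thread) is to carry the trailing partial line of each chunk over into the next read. The demo file name and contents below are invented, and the chunk size is made tiny on purpose to force mid-line splits:

```php
<?php
// Hypothetical demo file with known lines.
$log_file_name = "access_log.1";
file_put_contents($log_file_name, "line one\nline two\nline three\n");

$fd    = fopen("./$log_file_name", "r");
$carry = "";          // partial line left over from the previous chunk
$lines = array();

while (!feof($fd)) {
    $buf = $carry . fread($fd, 8);    // tiny chunk size to force splits
    $pos = strrpos($buf, "\n");
    if ($pos === false) {
        $carry = $buf;                // no newline yet: whole chunk is partial
        continue;
    }
    // complete lines up to the last newline; the remainder carries over
    $carry = substr($buf, $pos + 1);
    foreach (explode("\n", substr($buf, 0, $pos)) as $line) {
        $lines[] = $line;             // process each whole line here
    }
}
if ($carry !== "") {
    $lines[] = $carry;                // flush a final unterminated line
}
fclose($fd);
unlink($log_file_name);
```

Even with 8-byte reads, every entry in $lines comes out as a complete line, because partial tails are re-prepended before the next chunk is scanned.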
RE: [PHP] Reading large files via PHP?
> I would rather have it read line by line for X lines, process that
> buffer, then go and grab more lines until end of file. Is that possible?

Sure. If it's a text file you could do something like this:

    $i = 0;
    $buf = "";
    while ( !feof($fd) ) {
        $buf .= fgets($fd);
        if ( ++$i % 10 == 0 ) {
            // process buf here
            $buf = "";
        }
    }

Or you could turn buf into an array with $buf[] = fgets($fd) if that's easier...
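The array variant mentioned above might look like the sketch below. The demo file name and contents are invented so the example runs standalone, and the batch size of 10 is arbitrary; $processed stands in for whatever real work (say, a multi-row INSERT per batch) you would do:

```php
<?php
// Hypothetical demo file: 25 numbered lines.
$log_file_name = "access_log.1";
file_put_contents($log_file_name, implode("\n", range(1, 25)) . "\n");

$fd        = fopen("./$log_file_name", "r");
$batch     = array();
$processed = 0;

while (($line = fgets($fd)) !== false) {
    $batch[] = rtrim($line, "\n");
    if (count($batch) == 10) {
        // process the batch here, e.g. one multi-row INSERT per batch
        $processed += count($batch);
        $batch = array();
    }
}
if (count($batch) > 0) {
    $processed += count($batch);   // flush the final partial batch
    $batch = array();
}
fclose($fd);
unlink($log_file_name);
```

Testing fgets() against false directly, rather than looping on feof(), also avoids processing a spurious empty "line" after the last read.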
All kinds of options :)

-philip