[PHP] Redirect output to a file on the web server
Greetings List members,

I have a script that takes quite a while to run (one or two hours), and I wish to redirect the normal PHP output to a file on the web server itself. I don't mind if, in the process, the browser displays a blank page. The reason I want to do this is that if the script crashes or the browser is closed by mistake, I have absolutely no record of where the script stopped running.

I could use code like the below. At the beginning of the script:

ob_start();

At the end of the script:

$page = ob_get_contents();
ob_end_flush();
$fp = fopen('output.html', 'w');
fwrite($fp, $page);
fclose($fp);

However, I see some problems with this. First, I'm not too sure of the size of the output; it may balloon past the buffering limit (in PHP? Apache?), and then what happens? Secondly, if the script crashes before the end, I won't get any output at all. Finally, I am using a library in the script that outputs status and error messages of its own, so if I manually opened a file and used fwrite() alongside echo for my own messages, I would lose those library messages.

Does anybody have any pointers on how to send the output not only to the browser, but also to a file on the web server? If not, at least to a file?

Thanks and regards,
Ferdi
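One way to address all three concerns in the post above (a sketch, not from the thread): instead of buffering everything and writing it once at the end, pass a callback to ob_start() that tees each chunk to a file as the buffer is flushed. Anything echoed by the library is captured too, because it passes through the same output buffer. The file name output.log is a hypothetical choice.

```php
<?php
// Sketch: tee all output (including the library's echo'd messages) to a
// log file as it is produced. 'output.log' is a hypothetical file name.
$log = fopen('output.log', 'w');

ob_start(function ($chunk) use ($log) {
    fwrite($log, $chunk);   // copy the chunk to the log file
    fflush($log);           // push it to disk, so a crash loses little
    return $chunk;          // and still send it to the browser
}, 4096);                   // invoke the callback every 4 KB of output

echo "Mail to customer 1: sent\n";   // would normally come from the library
echo "Mail to customer 2: FAILED\n";

ob_end_flush();             // drain whatever is still buffered
fclose($log);
```

Because the callback runs every time the 4 KB buffer fills, the output never balloons in memory, and the log holds everything produced up to the moment of a crash rather than nothing at all.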
Re: [PHP] Redirect output to a file on the web server
On 6 December 2010 20:47, Steve Staples sstap...@mnsi.net wrote:

On Mon, 2010-12-06 at 20:29 +0530, Ferdi wrote: [original message quoted in full, snipped]

Just curious, but if it takes that long to run, why are you running it from a browser? Why not run it from the command line? That way you don't have to change your php.ini for the web server (increasing the timeout, memory limits, etc.); you can change those for the CLI only. Two hours is a long time to hope that the browser doesn't close, or that connectivity doesn't get interrupted for even one microsecond.
If the script has breaks in it, where it starts to do something else, you can put in an email to yourself to say hey, we're HERE now. But I would look into running it from the CLI rather than the web server; you would be less likely to run into issues on something that takes that amount of time to run. If you need the output to be displayed on a web page, you can write the progress to a file and then have a PHP web page that reads the file and, using ajax or whatever, refreshes the content.

Good luck with your script, and if you still run it from the browser and need to output to a file, then I would continually write content to that file, every time you do something or start another part of the script, so you know EXACTLY where you are at all times.

Steve

Hi Steve,

Thanks for the tips. To answer your queries: I don't mind using the CLI. How do I then ensure that the messages, error or otherwise, output by the library I use show up in the file I'm outputting to? Please note that I only make calls to the functions and object methods from this library; error or success messages are probably echo'd by the library's own code.

I believe some context is in order. I am actually sending an email to a large number of our customers (around 10,000), each addressed with the customer's name. My script outputs a simple success or failure message; the library outputs more technical info. All I would like to know is which email got sent and which one failed. As you point out, I could write the success / failure messages to a file, but I would also like to capture the messages output by the library.

Thanks and regards,
Ferdi
[PHP] Re: Execute a php page and don't wait for it to finish
On 19 October 2010 18:50, Ferdi ferdinan...@printo.in wrote: [original message quoted in full, snipped]

Hi List,

Sorry this took so long, but I wanted to close the loop (and maybe ease someone else's trouble :-)). I hadn't thought much about the die(header('Location: run_this_even_if_user_aborts.php')) call I was actually using to get this working. When I carefully looked up header() on php.net, I realised it was a browser redirect! No wonder the script would work sometimes but not always. It's clear now that every time the script worked, it was because I had waited long enough for the browser to be redirected before killing the page. I finally settled on using jQuery's ajax calls.

Thanks once again to the repliers.
Ferdi
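The pattern this thread converged on can be sketched from the target script's side (a sketch, not the poster's actual code): the ajax call triggers update.php, which detaches itself from the client and keeps running.

```php
<?php
// update.php -- sketch of the target script's opening lines, per the
// thread: keep running even if the ajax caller disconnects, and hand
// the connection back to the client right away.
ignore_user_abort(true);    // don't die when the client goes away
set_time_limit(0);          // no execution-time cap

// Tell the client we are done, so the ajax call returns immediately
// (a common trick; header() calls are harmless no-ops under the CLI).
header('Connection: close');
header('Content-Length: 0');
flush();

// ... the long database update continues here, browser long gone ...
```

The ajax caller sees an immediate empty response and moves on, while PHP finishes the update in the background of the same request.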
[PHP] Execute a php page and don't wait for it to finish
Hi List,

I have a PHP page that updates data from one database to another when it is run. My query is: how can I trigger the execution of this update page from another PHP / JavaScript page without the calling page having to wait for the update page to finish?

Basically, I think the update page needs to use:

ignore_user_abort(1);
set_time_limit(0); // I don't think the script will take more than 1 min.

At the other end, I found this:

1) http://www.mindraven.com/blog/php/run-a-php-script-in-the-background-using-ajax/
2) On that page a user suggested using pclose(popen('/usr/bin/php /path/to/something.php > /dev/null &', 'r')). However, I need this to be usable on Windows servers also.
3) Finally, would pcntl_exec, pcntl_fork, exec or something similar be useful for me?

Which of the above 3 options is the better one? Other suggestions are welcome :)

Thanks and Regards,
Ferdi
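Option 2 can be made Windows-friendly by branching on PHP_OS, since pcntl_* (option 3) is Unix-only. A sketch, assuming the php binary is on the PATH; the script paths are hypothetical:

```php
<?php
// Sketch: launch a PHP script in the background and return immediately.
// Assumes the `php` binary is on the PATH; paths are hypothetical.
function run_in_background($script)
{
    if (strtoupper(substr(PHP_OS, 0, 3)) === 'WIN') {
        // "start /B" detaches the child process on Windows
        pclose(popen('start /B php ' . escapeshellarg($script), 'r'));
    } else {
        // discard output and background the child with & on Unix
        pclose(popen('php ' . escapeshellarg($script) . ' > /dev/null 2>&1 &', 'r'));
    }
}

run_in_background('update.php'); // the caller continues without waiting
```

pclose() returns as soon as the shell that launched the child exits, so the calling page is not held up for the duration of the update.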
Re: [PHP] PHP on command line -- mysql_connect error
On 14 June 2010 21:58, Ashley Sheridan a...@ashleysheridan.co.uk wrote:

On Mon, 2010-06-14 at 21:36 +0530, Ferdi wrote: [original message quoted in full, snipped]

It sounds like maybe you have two different setups of PHP, or that the CLI isn't using the same php.ini as the server module. What happens if you run phpinfo() from the CLI? Do you get the output you expect? You should see the php.ini location as being the same as the server module's. If not, you can pass the location of the php.ini in with the command line arguments. If you have two different installations of PHP (which some people tend to do), then you might need to make sure the right modules are installed in both. Try the ini thing first and see if that fixes the issue.

Thanks,
Ash
http://www.ashleysheridan.co.uk

Hi List,

When I read Ashley's reply, I figured: why not use the other PHP interpreter, the one the web server uses? It worked :-). Thanks Ashley.
Just for academic interest, I have put the phpinfo() output I get with the command-line PHP interpreter up on pastebin. That interpreter still does not work, but that's immaterial now. Find it at: http://pastebin.com/XL50eBFm

Regards,
Ferdi
[PHP] PHP on command line -- mysql_connect error
Hi list,

My config is XAMPP 1.7.2 with PHP 5.3.0. I'm trying to run a PHP script as a cron job. The same script works perfectly from the browser, but fails when I try it from the command line (I have not yet set it up as cron). I get the following error:

PHP Fatal error: Call to undefined function mysql_connect() in /opt/lampp/htdocs/CS/weekly_email_report.php on line 3

I tried using dl('mysql.so') before the mysql_connect() call, but to no avail; I get:

PHP Warning: dl(): Unable to load dynamic library '/usr/lib/php/modules/mysql.so' - /usr/lib/php/modules/mysql.so: cannot open shared object file: No such file or directory in /opt/lampp/htdocs/CS/weekly_email_report.php on line 2
PHP Fatal error: Call to undefined function mysql_connect() in /opt/lampp/htdocs/CS/weekly_email_report.php on line 3

In general, many functions that work when the page is accessed from the browser fail on the command line. I tried setting extension=mysql.so in both php.ini files (the one used by the web server and the one used by the command line (/etc/php.ini, correct??)), though not simultaneously.

Any pointers??

TIA,
Ferdi
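A quick way to see which ini file and extensions the command-line interpreter is actually using, without wading through full phpinfo() output (a sketch; php_ini_loaded_file() needs PHP 5.2.4+):

```php
<?php
// Sketch: diagnose the "works in the browser, fails on the CLI" split
// by printing what this particular interpreter actually loaded.
$ini = php_ini_loaded_file();   // false if no php.ini was loaded at all
echo "Loaded ini   : " . ($ini !== false ? $ini : '(none)') . "\n";
echo "mysql loaded : " . (extension_loaded('mysql') ? 'yes' : 'no') . "\n";
echo "mysqli loaded: " . (extension_loaded('mysqli') ? 'yes' : 'no') . "\n";
```

If the CLI reports a different ini than the server module, you can point it at the web server's file explicitly with the -c flag, e.g. `php -c /opt/lampp/etc/php.ini weekly_email_report.php` (that path is XAMPP's usual location on Linux).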
[PHP] Email from php
Hi List,

I have met with little success sending mail from PHP. I have used mainly the mail() function, but have also tried imap_mail(), which the documentation says is just a wrapper around mail(). Here is my understanding of the situation:

On Windows (WampServer 2.0i), I manage to send email after setting SMTP = smtp.someserver.com. I guess this works because the server I use relays all mail it receives; it does not check whether the user is registered, and it obviously does not bother about a password.

On Linux (CentOS 5.x, XAMPP 1.7.2), mail() does not work. From what I have understood, on Linux the SMTP server settings in php.ini do not matter; the sendmail / mail (something like that) utility is called, which sends the mail. I have put the correct sendmail_path setting in php.ini, but I guess this utility is not configured, since I don't receive the mail and running the code does not throw any errors or warnings.

What I need to achieve is the ability to send attachments in an email from PHP. I would like one of the following options (in order of preference):

1. Create an email account (specifically Google Apps Mail) and send email as that user. This will not be trivial; I will need to get a fix on authenticating, logging in, etc.

2. Send mail using an SMTP server. Could you also point me to links showing installation / setup of an SMTP server on Linux? The XAMPP docs say the Mercury mail server is included, but I couldn't find any help for setting it up.

3. Configure the sendmail utility on Linux.

Alternatively, any URLs giving the entire picture are also welcome. So far, I have only found info that is an almost verbatim copy of the PHP manual: it just explains the different mail libraries and functions without considering the mail servers and all the back end.

Thanks for taking the trouble to read!

Regards,
Ferdi
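On the attachment goal specifically: plain mail() can carry an attachment if you build a multipart MIME body yourself. The sketch below shows only the body construction (the addresses, file name and data are hypothetical); for authenticated SMTP against something like Google Apps Mail, a dedicated library such as PHPMailer is the usual route rather than hand-rolled sockets.

```php
<?php
// Sketch: build a multipart/mixed MIME message so mail() can carry an
// attachment. Returns array($extraHeaders, $body), which go in mail()'s
// 4th and 3rd arguments respectively.
function build_mime_message($text, $filename, $filedata)
{
    $boundary = 'b' . md5(uniqid('', true));   // unique part separator
    $headers  = "MIME-Version: 1.0\r\n"
              . "Content-Type: multipart/mixed; boundary=\"{$boundary}\"";

    $body  = "--{$boundary}\r\n"
           . "Content-Type: text/plain; charset=utf-8\r\n\r\n"
           . "{$text}\r\n"
           . "--{$boundary}\r\n"
           . "Content-Type: application/octet-stream; name=\"{$filename}\"\r\n"
           . "Content-Transfer-Encoding: base64\r\n"
           . "Content-Disposition: attachment; filename=\"{$filename}\"\r\n\r\n"
           . chunk_split(base64_encode($filedata))   // 76-char base64 lines
           . "--{$boundary}--\r\n";

    return array($headers, $body);
}

// Hypothetical usage:
// list($headers, $body) = build_mime_message('See attached report.',
//                                            'report.csv', $csvData);
// mail('customer@example.com', 'Weekly report', $body, $headers);
```

Note that this still relies on a working sendmail_path (Linux) or SMTP setting (Windows) underneath; the MIME body only solves the attachment part of the problem, not delivery.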