> -----Original Message-----
> From: Shawn McKenzie [mailto:[EMAIL PROTECTED]
> Sent: Monday, February 25, 2008 11:19 PM
> To: php-general@lists.php.net
> Subject: [PHP] Re: PHP cuts download process prematurely
> 
> Manuel Barros Reyes wrote:
> > I am building a report application that generates some text files for
> > download, and when the download starts it stops prematurely.
> >
> > The file sizes are currently on the order of megabytes. When I run the
> > script that generates and sends the file on a test server, the process
> > goes smoothly no matter the size of the file, but as soon as I move the
> > script to the production server, downloads cut off at approx. 300 KB.
> > My current workaround is to gzip the files, and that is buying me some
> > extra time, but the files are growing and sooner or later my workaround
> > will become useless.
> >
> > I guess the download is stopped by some timeout and not by the amount
> > of KB downloaded, because the size at which it cuts off varies
> > slightly. If that timeout exists, it should be approx. 5-10 seconds.
> >
> > I use this function to send the file: $contenido is the content of the
> > file (I assign the big chunk of output from the report to that
> > variable), and $nombre_archivo is an optional name for the file. I can
> > paste more code, but I think the problem is here.
> >
> > <?php
> > function enviarArchivo($contenido, $nombre_archivo = "") {
> >
> >     if ($nombre_archivo == "") {
> >         $nombre_archivo = date("dmyHi") . ".csv";
> >     }
> >
> >     header("Content-Type: application/octet-stream");
> >     header("Content-Disposition: attachment; filename=$nombre_archivo");
> >     header("Content-Length: " . strlen($contenido));
> >     echo $contenido;
> > }
> > ?>
> >
> > Thanks in advance
> > Manuel
> 
> What does your error log say when this happens?
> 
> -Shawn

Though this is not likely to solve the problem, try adding the following two
lines at the beginning of the script (even before you query the database and do
all your logic):

ignore_user_abort(true);
set_time_limit(0);
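
In the context of the quoted script, that would look something like this (just
a sketch; enviarArchivo is the function from the quoted message):

<?php
ignore_user_abort(true); // keep running even if the client disconnects
set_time_limit(0);       // lift the 30-second default execution time limit

// ... query the database and build the report into $contenido ...
enviarArchivo($contenido);
?>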

If this solves the problem, you should read http://ar2.php.net/info and
http://ar2.php.net/manual/es/function.ignore-user-abort.php carefully and then
choose more sensible settings. PHP has a default maximum execution time of 30
seconds.

However, this is not likely to solve the problem, as execution time should not
be affected by streaming, and we are assuming the user/browser is not aborting
the connection. On the other hand, you could support download resuming (it
involves some tricky headers; if you are willing to dig deeper into this, read
the manual notes at http://ar.php.net/manual/es/function.header.php and see the
sketch below). But this won't solve your problem... at least not for IE, as it
doesn't support resuming :(.
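
For illustration only, here is a rough sketch of single-range resuming.
enviarArchivoReanudable is a made-up name; this only handles simple
"Range: bytes=N-" requests, not the full spec, and like the original function
it keeps the whole file in memory:

<?php
// Hypothetical sketch: serve $contenido honoring a simple "Range: bytes=N-" header.
function enviarArchivoReanudable($contenido, $nombre_archivo) {
    $total  = strlen($contenido);
    $inicio = 0;

    // Detect a resume request from the client, e.g. "Range: bytes=307200-"
    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d+)-/', $_SERVER['HTTP_RANGE'], $m)) {
        $inicio = (int)$m[1];
        header('HTTP/1.1 206 Partial Content');
        header('Content-Range: bytes ' . $inicio . '-' . ($total - 1) . '/' . $total);
    }

    header('Accept-Ranges: bytes');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $nombre_archivo . '"');
    header('Content-Length: ' . ($total - $inicio));
    echo substr($contenido, $inicio); // send only the part the client asked for
}
?>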

You need the log files to know exactly what the problem is. And even if this
doesn't solve the issue and you keep using compression as a workaround, you may
also want to add this at the beginning of the script:

ob_start("ob_gzhandler"); // compress the output transparently if the browser supports it

Or... you can use the "zlib.output_compression" INI setting in an .htaccess
file or in php.ini.
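
For instance, assuming Apache with mod_php for the .htaccess variant:

# .htaccess
php_flag zlib.output_compression On

Or in php.ini:

zlib.output_compression = On

Note that ob_gzhandler and zlib.output_compression cannot be used at the same
time; pick one or the other.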

Compressed output requires more server processing but downloads much faster.
ob_gzhandler and zlib compression are transparent to the end user: the browser
decompresses on the fly, so the user gets the uncompressed file and there is no
need to gzip the files programmatically.

Regards,

Rob


Andrés Robinet | Lead Developer | BESTPLACE CORPORATION
5100 Bayview Drive 206, Royal Lauderdale Landings, Fort Lauderdale, FL 33308 |
TEL 954-607-4207 | FAX 954-337-2695 |
Email: [EMAIL PROTECTED] | MSN Chat: [EMAIL PROTECTED] | SKYPE: bestplace |
Web: bestplace.biz | Web: seo-diy.com
