Re: [PHP] Re: PHP cuts download process prematurely

2008-02-26 Thread Manuel Barros Reyes
On Tue, Feb 26, 2008 at 3:07 AM, Andrés Robinet [EMAIL PROTECTED] wrote:

  You need the log files to know exactly what the problem is. And, even if you are

I'll try to rescue those logs. Many thanks again.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] Re: PHP cuts download process prematurely

2008-02-26 Thread Manuel Barros Reyes
On Tue, Feb 26, 2008 at 3:07 AM, Andrés Robinet [EMAIL PROTECTED] wrote:

  Though this is not likely to solve the problem, try adding the following two
  lines at the beginning of the script (even before you query the database and do
  all your logic)

  ignore_user_abort(true);
  set_time_limit(0);

  You need the log files to know exactly what the problem is. And, even if you are
  not solving this issue using compression as a workaround, you may also want to
  add at the beginning of the script:

  ob_start("ob_gzhandler");


From what I read, I think ob_start("ob_gzhandler") would be needed if I
wanted to compress output transparently and let the client's browser
do the reverse job in the same manner, but in this case I am
uploading the files as .gz and the person who downloads them takes
care of decompressing them by hand.

Maybe I should use a simple ob_start() in this case. I understand
ob_start()/ob_end_flush() collects data in a server buffer and outputs
it all together, but in the case of my script the only output is the
file I am serving, and that output happens only in the echo $contenido.
Unless I am missing details of ob_start(), wouldn't this be equivalent
in this case?
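A quick sketch of what I mean (the content here is made up, not my actual
report data):

```php
<?php
// Plain output buffering: bytes are collected, then sent unchanged.
ob_start();
echo "col1,col2\n1,2\n";           // pretend this is the report content
$plain = ob_get_clean();           // exactly the bytes that echo produced

// ob_gzhandler would instead gzip the buffer (when the client sends
// Accept-Encoding: gzip) and add a Content-Encoding header, so the
// browser decompresses transparently. With a single echo of a file
// that is already a .gz, plain ob_start() changes nothing: the output
// is byte-for-byte what echo would have sent directly.
var_dump($plain === "col1,col2\n1,2\n"); // bool(true)
```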

Thanks




Re: [PHP] Re: PHP cuts download process prematurely

2008-02-26 Thread Wolf
What you really need to do is grab the phpinfo from both servers and see what 
the differences are.  Either put them side-by-side on the screen, or print them 
both out; but either way you need to do a line-by-line, 
configuration-by-configuration check to see what/where the differences in the 
setup are.

Your sys-admin should have made the test servers EXACTLY like the production
servers so that working with them would be seamless, but not all
programming/test/production environments are ideal.

Wolf
 Manuel Barros Reyes [EMAIL PROTECTED] wrote: 
 On Tue, Feb 26, 2008 at 3:07 AM, Andrés Robinet [EMAIL PROTECTED] wrote:
 
   Though this is not likely to solve the problem, try adding the following two
   lines at the beginning of the script (even before you query the database and do
   all your logic)
 
   ignore_user_abort(true);
   set_time_limit(0);
 
   You need the log files to know exactly what the problem is. And, even if you are
   not solving this issue using compression as a workaround, you may also want to
   add at the beginning of the script:
 
   ob_start("ob_gzhandler");
 
 
 From what I read, I think ob_start("ob_gzhandler") would be needed if I
 wanted to compress output transparently and let the client's browser
 do the reverse job in the same manner, but in this case I am
 uploading the files as .gz and the person who downloads them takes
 care of decompressing them by hand.
 
 Maybe I should use a simple ob_start() in this case. I understand
 ob_start()/ob_end_flush() collects data in a server buffer and outputs
 it all together, but in the case of my script the only output is the
 file I am serving, and that output happens only in the echo $contenido.
 Unless I am missing details of ob_start(), wouldn't this be equivalent
 in this case?
 
 Thanks
 



RE: [PHP] Re: PHP cuts download process prematurely

2008-02-25 Thread Andrés Robinet
 -Original Message-
 From: Shawn McKenzie [mailto:[EMAIL PROTECTED]
 Sent: Monday, February 25, 2008 11:19 PM
 To: php-general@lists.php.net
 Subject: [PHP] Re: PHP cuts download process prematurely
 
 Manuel Barros Reyes wrote:
  I am building a report application that generates some text files for
  download and when the download starts it stops prematurely.
 
  The file sizes are currently on the order of megabytes, and when I
  try the script that generates and sends the file on a test server the
  process goes smoothly no matter the size of the file, but as soon as I
  move the script to the production server downloads cut off at approx.
  300 KB. My current workaround is to gzip the files, and that is giving
  me some extra time, but the files are growing and sooner or later my
  workaround will become useless.
 
  I guess the download is stopped by some timeout and not because of the
  amount of KB downloaded, because the size varies slightly. If that
  timeout exists it should be approx. 5-10 seconds.
 
  I use this function to send the file; $contenido is the content of
  the file, and to that variable I assign the big chunk of output from
  the report; $nombre_archivo is the optional name for the file. I can
  paste more code but I think the problem is here.
 
  <?php
  function enviarArchivo($contenido, $nombre_archivo = "") {
 
      if ($nombre_archivo == "") {
          $nombre_archivo = date("dmyHi") . ".csv";
      }
 
      header("Content-Type: application/octet-stream");
      header("Content-Disposition: attachment; filename=$nombre_archivo");
      header("Content-Length: " . strlen($contenido));
 
      echo $contenido;
  }
  ?>
 
  Thanks in advance
  Manuel
 
 What does your error log say when this happens?
 
 -Shawn

Though this is not likely to solve the problem, try adding the following two
lines at the beginning of the script (even before you query the database and do
all your logic)

ignore_user_abort(true);
set_time_limit(0);

If this solves the problem you should read this http://ar2.php.net/info and this
http://ar2.php.net/manual/es/function.ignore-user-abort.php carefully and then
choose more rational settings. PHP has a default execution time of 30s.

However, this is not likely to solve the problem as execution time should not be
affected by streaming and we are assuming the user/browser is not aborting the
connection. On the other hand you could support download resuming (it involves
some tricky headers and if you are willing to dig deeper into this read the
manual notes at http://ar.php.net/manual/es/function.header.php). But this won't
solve your problem... at least for IE as it doesn't support resuming :(.
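
In case you do go that route, the core of it boils down to parsing the Range
header. A rough sketch (the function name and the single-range limitation are
simplifications of mine, not a complete implementation):

```php
<?php
// Parse a "Range: bytes=start-end" header against a known file size.
// Returns [start, end] (inclusive byte offsets) or null when the header
// is absent or malformed. Only a single range is handled here.
function parseRange(?string $header, int $size): ?array {
    if ($header === null || !preg_match('/^bytes=(\d*)-(\d*)$/', $header, $m)) {
        return null;
    }
    if ($m[1] === '' && $m[2] === '') {
        return null;                     // "bytes=-" is meaningless
    }
    if ($m[1] === '') {                  // "bytes=-500": the last 500 bytes
        $start = max(0, $size - (int)$m[2]);
        $end   = $size - 1;
    } else {
        $start = (int)$m[1];
        $end   = ($m[2] === '') ? $size - 1 : min((int)$m[2], $size - 1);
    }
    return ($start <= $end) ? [$start, $end] : null;
}

// The download script would then reply with "HTTP/1.1 206 Partial Content",
// a header like "Content-Range: bytes 100-999/1000", and only that slice.
```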

You need the log files to know exactly what the problem is. And, even if you are
not solving this issue using compression as a workaround, you may also want to
add at the beginning of the script:

ob_start("ob_gzhandler");

Or... you can use the zlib.output_compression INI setting in an .htaccess file
or in php.ini.

Compressed files will require more server processing but will download way
faster. ob_gzhandler and zlib compression are transparent to the end user, so
they will think they've got the uncompressed file, no need to gzip the files
programmatically.
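
For reference, the .htaccess route can look like this (assuming Apache with
mod_php; the compression level is just an example):

```apache
# .htaccess -- compress PHP output transparently for this directory
php_flag  zlib.output_compression On
php_value zlib.output_compression_level 6
```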

Regards,

Rob


Andrés Robinet | Lead Developer | BESTPLACE CORPORATION 
5100 Bayview Drive 206, Royal Lauderdale Landings, Fort Lauderdale, FL 33308 |
TEL 954-607-4207 | FAX 954-337-2695 | 
Email: [EMAIL PROTECTED]  | MSN Chat: [EMAIL PROTECTED]  |  SKYPE: bestplace |
 Web: bestplace.biz  | Web: seo-diy.com
