Re: [PHP] how to provide download of files mow in documentroot

2010-03-31 Thread Anshul Agrawal
On Wed, Mar 31, 2010 at 1:12 AM, Nathan Rixham nrix...@gmail.com wrote:

 Anshul Agrawal wrote:
  On Tue, Mar 30, 2010 at 8:41 PM, Jan G.B. ro0ot.w...@googlemail.com
 wrote:
 
  2010/3/30 Nathan Rixham nrix...@gmail.com:
  Jan G.B. wrote:
  2010/3/29 Nathan Rixham nrix...@gmail.com
 
  Jan G.B. wrote:
  2010/3/29 Nathan Rixham nrix...@gmail.com
 
  Jan G.B. wrote:
  Top posting sucks, so I'll answer the post somewhere down there.
  SCNR
 
  2010/3/29 Devendra Jadhav devendra...@gmail.com
 
  Then you can do file_get_contents within PHP. or any file
 handling
  mechanism.
  On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:
  Hi, I am writing a web application in PHP. This webapp primarily
  focuses on file uploads and downloads. The uploaded files will be
  saved in a folder which is not in document root, and my query is:
  how will I be able to provide downloads of such files, not located
  in document root, via PHP?
 
  Try something like that
  <?php
  $content = file_get_contents($filename);
  $etag = md5($content);
  header('Last-Modified: '.gmdate('D, d M Y H:i:s', filemtime($filename)).' GMT');
  header('ETag: '.$etag);
  header('Accept-Ranges: bytes');
  header('Content-Length: '.strlen($content));
  header('Cache-Control: '.$cache_value); // you decide
  header('Content-type: '.$should_be_set);
  echo $content;
  exit;
  ?>
 
  Depending on the file size, you should use something other than
  file_get_contents() (for example fopen/fread). file_get_contents()
  on a huge file will exhaust your web server's RAM.
  Yup, so you can map the directory /path/to in the web server config,
  then allow access only from localhost + yourdomain. This means you
  can then request it like a URL and do a HEAD request to get the ETag
  etc., then return a 304 Not Modified if you received a matching
  ETag, Last-Modified etc. (thus meaning you only file_get_contents
  when really, really needed).
  I'd advise against saying you Accept-Ranges bytes if you don't
  accept byte ranges (i.e. you aren't going to send little bits of the
  file).
 
  If you only need the downloads to be secure, then you could easily
  leave PHP out altogether and simply expose the directory via a
  location so that it is web accessible, and set it up to ask for
  auth using htpasswd, a custom script, LDAP or whatever.

  And if you don't need security, then why have PHP involved at all?
  Simply symlink to the directory or expose it via HTTP and be done
  with the problem in a minute or two.
 
  Regards!
 
  In my opinion, serving user content directly off a production
  server is wicked. You don't want your visitors to upload malicious
  files that may trigger modules such as mod_php in Apache. So it
  makes sense to store user uploads outside of a docroot, with no
  symlink or anything whatsoever.
  even the simplest of server configurations will ensure safety. just
 use
  .htaccess to SetHandler default-handler which treats everything as
  static content and serves it right up.
 
  Yes. But the average person posting here isn't a server-config god,
  I believe.
  Also, you cannot implement permissions on these files.
  The discussion was about serving files from a place outside any
  docroot! Guess there is a reason for that.
 
 
  One more thing added: your RAM will be exhausted even if you open
  that 600 MB file just once.
  Apache's memory handling is a bit weird: if *one* Apache process is
  using 200 MB of RAM on *one* impression because your application
  uses that much, then that process will not release the memory while
  it's serving another 1000 requests for `clear.gif`, which is maybe
  850 bytes in size.
  again, everything depends on how you have your server configured;
  you can easily tell Apache to kill each child after one run, or a
  whole host of other configs; but ultimately, if you can avoid
  opening up that file in PHP then do; serving statically as above is
  the cleanest, quickest way to do it (other than using S3 or
  similar).
 
  regards!
 
  Sure, you could configure your Apache like that. Unless you have
  some traffic on your site, that is, because the time-intensive
  thing for Apache is spawning new processes. So it's just not a good
  idea to do that, nor to serve big files via file_get_contents().
  I was only addressing an issue you pointed out.. anyways.. so what
  exactly do you propose? don't serve via Apache, don't use
  file_get_contents, instead do..?

  ps you do realise that virtually every huge file on the net is
  served via a web server w/o problems, yeah?
 
 
  I was recommending other file methods: fopen() combinations,
  fpassthru(), and best of all readfile(). None of them buffers the
  whole file in memory.
 
  http://php.net/readfile
  http://php.net/fpassthru
 
  Regards
 
 
 
  --
  PHP General Mailing List (http://www.php.net/)
  To unsubscribe, visit: http://www.php.net/unsub.php
 
 
 I wanted to see the diff between the memory usage of the following
 three methods in PHP:
 1. readfile
 2. fopen followed by fpassthru, and
 3. file_get_contents

Re: [PHP] how to provide download of files mow in documentroot

2010-03-31 Thread Tommy Pham
On Wed, Mar 31, 2010 at 12:43 AM, Anshul Agrawal drinknder...@gmail.com wrote:

Re: [PHP] how to provide download of files mow in documentroot

2010-03-30 Thread Jan G.B.
2010/3/29 Nathan Rixham nrix...@gmail.com

 even the simplest of server configurations will ensure safety. just use
 .htaccess to SetHandler default-handler which treats everything as
 static content and serves it right up.


Yes. But the average person posting here isn't a server-config god, I
believe.
Also, you cannot implement permissions on these files.
The discussion was about serving files from a place outside any
docroot! Guess there is a reason for that.
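For concreteness, the .htaccess Nathan describes could look something
like this (a sketch: `SetHandler default-handler` is the directive he
names; the directory layout around it is assumed):

```apacheconf
# .htaccess in the web-accessible uploads directory.
# Force every file through Apache's default (static) handler, so an
# uploaded evil.php is sent back as plain bytes instead of being
# executed by mod_php.
SetHandler default-handler
```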




 again everything depends on how you have your server configured; you can
 easily tell apache to kill each child after one run or a whole host of
 other configs; but ultimately if you can avoid opening up that file in
 php then do; serving statically as above is the cleanest quickest way to
 do it (other than using s3 or similar).

 regards!


Sure, you could configure your Apache like that. Unless you have some
traffic on your site, that is, because the time-intensive thing for
Apache is spawning new processes. So it's just not a good idea to do
that, nor to serve big files via file_get_contents().

Regards


Re: [PHP] how to provide download of files mow in documentroot

2010-03-30 Thread Nathan Rixham
Jan G.B. wrote:
 Sure, you could configure your Apache like that. Unless you have some
 traffic on your site, that is, because the time-intensive thing for
 Apache is spawning new processes. So it's just not a good idea to do
 that, nor to serve big files via file_get_contents().

I was only addressing an issue you pointed out.. anyways.. so what
exactly do you propose? don't serve via Apache, don't use
file_get_contents, instead do..?

ps you do realise that virtually every huge file on the net is served
via a web server w/o problems, yeah?


-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] how to provide download of files mow in documentroot

2010-03-30 Thread Jan G.B.
2010/3/30 Nathan Rixham nrix...@gmail.com:

 I was only addressing an issue you pointed out.. anyways.. so what
 exactly do you propose? don't serve via Apache, don't use
 file_get_contents, instead do..?

 ps you do realise that virtually every huge file on the net is served
 via a web server w/o problems, yeah?



I was recommending other file methods: fopen() combinations,
fpassthru(), and best of all readfile(). None of them buffers the
whole file in memory.

http://php.net/readfile
http://php.net/fpassthru

Regards
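As a rough sketch of the fopen()/fread() route (the function name,
chunk size and Content-Type here are my own choices, not from the
thread), streaming in fixed-size chunks keeps peak memory near the
chunk size rather than the file size:

```php
<?php
// Stream a file to the client in fixed-size chunks, so peak memory
// use stays near $chunkSize instead of growing with the file.
function stream_file($filename, $chunkSize = 8192)
{
    $fp = fopen($filename, 'rb');
    if ($fp === false) {
        return false;
    }
    header('Content-Type: application/octet-stream'); // pick the real type
    header('Content-Length: ' . filesize($filename));
    while (!feof($fp)) {
        echo fread($fp, $chunkSize);
        flush(); // push each chunk to the client instead of buffering it
    }
    fclose($fp);
    return true;
}
```

readfile($filename) does the same thing in one call; the manual loop
is only worth it if you want to throttle, resume, or count bytes
yourself.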

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] how to provide download of files mow in documentroot

2010-03-30 Thread Nathan Rixham
Jan G.B. wrote:
 I was recommending other file methods like fopen() combinations,
 fpassthru() and at best readfile(). All of them do not buffer the
 whole file in memory.
 
 http://php.net/readfile
 http://php.net/fpassthru

ahh so you were; completely missed that, apologies - readfile's the
one, and good advice.

still keen to point out that if you don't need any other features from
PHP, then why use PHP at all, when the web server will do the job
perfectly well? the primary reason for me mentioning this is to take
advantage of cache control / ETag / Last-Modified etc. (most PHP
scripts just return 200 OK repeatedly).

regards
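When the script does need to stay in the loop, the cache-validation
handling Nathan refers to could look roughly like this (the helper
name and the path+mtime+size ETag scheme are my own; the point, per
the thread, is answering 304 without re-reading the file):

```php
<?php
// Serve $filename with cache-validation headers, answering 304 Not
// Modified when the client's If-None-Match matches. The ETag is
// derived from path + mtime + size, so a 304 never reads the file.
function serve_cached($filename)
{
    $mtime = filemtime($filename);
    $etag  = '"' . md5($filename . $mtime . filesize($filename)) . '"';

    header('ETag: ' . $etag);
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');

    if (isset($_SERVER['HTTP_IF_NONE_MATCH'])
        && $_SERVER['HTTP_IF_NONE_MATCH'] === $etag) {
        header('HTTP/1.1 304 Not Modified');
        return; // no body; the client keeps its cached copy
    }

    header('Content-Length: ' . filesize($filename));
    readfile($filename); // streams the file without buffering it in RAM
}
```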

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



Re: [PHP] how to provide download of files mow in documentroot

2010-03-30 Thread Anshul Agrawal
On Tue, Mar 30, 2010 at 8:41 PM, Jan G.B. ro0ot.w...@googlemail.com wrote:


 I was recommending other file methods: fopen() combinations,
 fpassthru(), and best of all readfile(). None of them buffers the
 whole file in memory.

 http://php.net/readfile
 http://php.net/fpassthru

 Regards





I wanted to see the diff between the memory usage of the following
three methods in PHP:
1. readfile
2. fopen followed by fpassthru, and
3. file_get_contents
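One way to run that comparison (a quick sketch; the function name is
mine, and each method really belongs in a fresh process, since peak
usage never goes back down within one process):

```php
<?php
// Stream $filename with the named method, then report the peak memory
// PHP observed (in bytes). Run each method in a separate process so
// the readings don't contaminate each other.
function serve_and_measure($method, $filename)
{
    switch ($method) {
        case 'readfile':
            readfile($filename);               // streams internally
            break;
        case 'fpassthru':
            $fp = fopen($filename, 'rb');
            fpassthru($fp);                    // streams from the handle
            fclose($fp);
            break;
        case 'file_get_contents':
            echo file_get_contents($filename); // whole file in RAM at once
            break;
        default:
            throw new InvalidArgumentException("unknown method: $method");
    }
    return memory_get_peak_usage(true);
}
```

With a large test file, the file_get_contents run should report a peak
at least the file size above the other two.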


Re: [PHP] how to provide download of files mow in documentroot

2010-03-30 Thread Nathan Rixham
Anshul Agrawal wrote:
 On Tue, Mar 30, 2010 at 8:41 PM, Jan G.B. ro0ot.w...@googlemail.com wrote:
 
 2010/3/30 Nathan Rixham nrix...@gmail.com:
 Jan G.B. wrote:
 2010/3/29 Nathan Rixham nrix...@gmail.com

 Jan G.B. wrote:
 2010/3/29 Nathan Rixham nrix...@gmail.com

 Jan G.B. wrote:
 Top posting sucks, so I'll answer the post somewhere down there.
 SCNR

 2010/3/29 Devendra Jadhav devendra...@gmail.com

 Then you can do file_get_contents within PHP. or any file handling
 mechanism.
 On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:
 Hi
 i am writing a web application in php
 this webapp primarily focuses on file uploads and downloads
 the uploaded files will be saved in a folder which is not in
 document
 root
 and my query is how will i be able to provide download to such
 files
 not
 located in document root via php

 Try something like that
 ?php
 $content = file_get_contents($filename);
 $etag = md5($content);
 header('Last-Modified: '.gmdate('D, d M Y H:i:s',
 filemtime($filename)).' GMT');
 header('ETag: '.$etag);
 header('Accept-Ranges: bytes');
 header('Content-Length: '.strlen($content));
 header('Cache-Control: '.$cache_value); // you decide
 header('Content-type: '.$should_be_set);
 echo $content;
 exit;
 ?

 Depending on the $filesize, you should use something else than
 file_get_contents() (for example fopen/fread). file_get_contents on
 a
 huge
 file will exhaust your webservers RAM.
 Yup, so you can map the Directory /path/to in web server config;
 then
 allow from only from localhost + yourdomain. This means you can
 then
 request it like an url and do a head request to get the etag etc then
 return a 304 not modified if you received a matching etag
 Last-Modified
 etc; (thus meaning you only file_get_contents when really really
 needed).
 I'd advise against saying you Accept-Ranges bytes if you don't accept
 byte ranges (ie you aren't going to send little bits of the file).

 If you need the downloads to be secure only; then you could easily
 negate php all together and simply expose the directory via a
 location
 so that it is web accessible and set it up to ask for auth using
 htpasswd; a custom script, ldap or whatever.

 And if you don't need security then why have php involved at all?
 simply
 symlink to the directory or expose it via http and be done with the
 problem in a minute or two.

 Regards!

 In my opinion, serving user-content on a productive server is wicked
 sick.
 You don't want your visitors to upload malicous files that may trigger
 some
 modules as mod_php in apache. So it makes sense to store user-uploads
 outside of a docroot and with no symlink or whatsover.
 even the simplest of server configurations will ensure safety. just use
 .htaccess to SetHandler default-handler which treats everything as
 static content and serves it right up.

 Yes. But the average person posting here isn't a server-config god, I
 believe.
 Also, you cannot implement permissions on these files.
 The discussion was about serving files from a place outside any docroot!
 Guess there is a reason for that.


 One more thing added: your RAM will be exhausted even if you open that
 600mb
 file just once.
 Apache's memory handling is a bit weird: if *one* Apache process is
 using
 200mb RAM on *one* impression because your application uses that much,
 then
 that process will not release the memory while it's serving another
 1000
 requests for `clear.gif` which is maybe 850b in size.
 again everything depends on how you have your server configured; you
 can
 easily tell apache to kill each child after one run or a whole host of
 other configs; but ultimately if you can avoid opening up that file in
 php then do; serving statically as above is the cleanest quickest way
 to
 do it (other than using s3 or similar).

 regards!

 Sure, you could configure your Apache like that, unless you have some
 traffic on your site, because the time-intensive thing for Apache is
 spawning new processes. So it's just not a good idea to do that, nor to
 serve big files via file_get_contents.
 was only addressing an issue you pointed out.. anyways.. so you propose
 what exactly? don't serve via apache, don't use file_get_contents,
 instead do..?

 ps you do realise that virtually every huge file on the net is served
 via a web server w/o problems yeah?


 I was recommending other file methods like fopen() combinations,
 fpassthru() and, at best, readfile(). None of them buffers the
 whole file in memory.

 http://php.net/readfile
 http://php.net/fpassthru

 Regards



 --
 PHP General Mailing List (http://www.php.net/)
 To unsubscribe, visit: http://www.php.net/unsub.php


 I wanted to see the diff between the memory usage of the following three
 methods in PHP.
 1. readfile
 2. fopen followed by fpassthru, and
 3. file_get_contents
 
 Using an xdebug trace, all three of them gave the same number. With
 memory_get_peak_usage(true) 

Re: [PHP] how to provide download of files mow in documentroot

2010-03-29 Thread Devendra Jadhav
Hey..

Try creating a soft link to the destination folder from the doc root.
I haven't tried it, but give it a try...


On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:

 Hi
 i am writing a web application in php
 this webapp primarily focuses on file uploads and downloads
 the uploaded files will be saved in a folder which is not in document root
 and my query is how will i be able to provide download to such files not
 located in document root via php


 --
 Bhaskar Tiwari
 GTSE Generalist
 Directory Services
 Microsoft

 
 All we have to decide is what to do with the time that has been given to us


 http://www.ebhakt.com/
 http://fytclub.net/
 http://ebhakt.info/




-- 
Devendra Jadhav
देवेंद्र जाधव


Re: [PHP] how to provide download of files mow in documentroot

2010-03-29 Thread ebhakt
No i don't want to create any soft links
that primarily rejects all the benefits of putting a file outside of
document root

i want some solution similar to the private file downloads provided by Drupal,

so that the php webserver provides the download and not apache
in realtime


On Mon, Mar 29, 2010 at 11:43 AM, Devendra Jadhav devendra...@gmail.comwrote:

 Hey..

 Try creating a soft link to the destination folder from the doc root.
 I haven't tried it, but give it a try...


 On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:

 Hi
 i am writing a web application in php
 this webapp primarily focuses on file uploads and downloads
 the uploaded files will be saved in a folder which is not in document root
 and my query is how will i be able to provide download to such files not
 located in document root via php


 --
 Bhaskar Tiwari
 GTSE Generalist
 Directory Services
 Microsoft

 
 All we have to decide is what to do with the time that has been given to
 us


 http://www.ebhakt.com/
 http://fytclub.net/
 http://ebhakt.info/




 --
 Devendra Jadhav
 देवेंद्र जाधव




-- 
Bhaskar Tiwari
GTSE Generalist
Directory Services
Microsoft


All we have to decide is what to do with the time that has been given to us


http://www.ebhakt.com/
http://fytclub.net/
http://ebhakt.info/


Re: [PHP] how to provide download of files mow in documentroot

2010-03-29 Thread Devendra Jadhav
Then you can do file_get_contents within PHP. or any file handling
mechanism.


On Mon, Mar 29, 2010 at 11:49 AM, ebhakt i...@ebhakt.com wrote:

 No i don't want to create any soft links
 that primarily rejects all the benefits of putting a file outside of
 document root

 i want some solution similar to the private file downloads provided by Drupal,

 so that the php webserver provides the download and not apache
 in realtime



 On Mon, Mar 29, 2010 at 11:43 AM, Devendra Jadhav 
 devendra...@gmail.comwrote:

 Hey..

 Try creating a soft link to the destination folder from the doc root.
 I haven't tried it, but give it a try...


 On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:

 Hi
 i am writing a web application in php
 this webapp primarily focuses on file uploads and downloads
 the uploaded files will be saved in a folder which is not in document
 root
 and my query is how will i be able to provide download to such files not
 located in document root via php


 --
 Bhaskar Tiwari
 GTSE Generalist
 Directory Services
 Microsoft

 
 All we have to decide is what to do with the time that has been given to
 us


 http://www.ebhakt.com/
 http://fytclub.net/
 http://ebhakt.info/




 --
 Devendra Jadhav
 देवेंद्र जाधव




 --
 Bhaskar Tiwari
 GTSE Generalist
 Directory Services
 Microsoft

 
 All we have to decide is what to do with the time that has been given to us


 http://www.ebhakt.com/
 http://fytclub.net/
 http://ebhakt.info/





-- 
Devendra Jadhav
देवेंद्र जाधव


Re: [PHP] how to provide download of files mow in documentroot

2010-03-29 Thread Jan G.B.
Top posting sucks, so I'll answer the post somewhere down there.
SCNR

2010/3/29 Devendra Jadhav devendra...@gmail.com

 Then you can do file_get_contents within PHP. or any file handling
 mechanism.
  On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:
  Hi
  i am writing a web application in php
  this webapp primarily focuses on file uploads and downloads
  the uploaded files will be saved in a folder which is not in document
  root
  and my query is how will i be able to provide download to such files
 not
  located in document root via php
 


Try something like that
<?php
$content = file_get_contents($filename);
$etag = md5($content);
header('Last-Modified: '.gmdate('D, d M Y H:i:s',
filemtime($filename)).' GMT');
header('ETag: '.$etag);
header('Accept-Ranges: bytes');
header('Content-Length: '.strlen($content));
header('Cache-Control: '.$cache_value); // you decide
header('Content-type: '.$should_be_set);
echo $content;
exit;
?>

Depending on the $filesize, you should use something else than
file_get_contents() (for example fopen/fread). file_get_contents on a huge
file will exhaust your webservers RAM.

Regards


Re: [PHP] how to provide download of files mow in documentroot

2010-03-29 Thread Nathan Rixham
Jan G.B. wrote:
 Top posting sucks, so I'll answer the post somewhere down there.
 SCNR
 
 2010/3/29 Devendra Jadhav devendra...@gmail.com
 
 Then you can do file_get_contents within PHP. or any file handling
 mechanism.
 On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:
 Hi
 i am writing a web application in php
 this webapp primarily focuses on file uploads and downloads
 the uploaded files will be saved in a folder which is not in document
 root
 and my query is how will i be able to provide download to such files
 not
 located in document root via php

 
 Try something like that
 <?php
 $content = file_get_contents($filename);
 $etag = md5($content);
 header('Last-Modified: '.gmdate('D, d M Y H:i:s',
 filemtime($filename)).' GMT');
 header('ETag: '.$etag);
 header('Accept-Ranges: bytes');
 header('Content-Length: '.strlen($content));
 header('Cache-Control: '.$cache_value); // you decide
 header('Content-type: '.$should_be_set);
 echo $content;
 exit;
 ?>
 
 Depending on the $filesize, you should use something else than
 file_get_contents() (for example fopen/fread). file_get_contents on a huge
 file will exhaust your webservers RAM.

Yup, so you can map the Directory /path/to in web server config; then
allow access only from localhost + yourdomain. This means you can then
request it like a URL and do a HEAD request to get the ETag etc., then
return a 304 not modified if you received a matching etag Last-Modified
etc; (thus meaning you only file_get_contents when really really needed).
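A rough sketch of that conditional-request flow (the path is hypothetical, and the ETag is built from mtime and size so answering a 304 never has to read the file):

```php
<?php
// Hypothetical file outside the docroot.
$filename = '/srv/files/report.pdf';

// Cheap validator: computed without reading the file contents.
$etag = md5(filemtime($filename) . '-' . filesize($filename));

header('ETag: "' . $etag . '"');
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($filename)) . ' GMT');

// Client already has this version? Answer 304 with no body at all.
if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
    trim($_SERVER['HTTP_IF_NONE_MATCH'], '" ') === $etag) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Length: ' . filesize($filename));
readfile($filename); // only now is the file actually sent
?>
```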

I'd advise against saying you Accept-Ranges bytes if you don't accept
byte ranges (ie you aren't going to send little bits of the file).

If you need the downloads to be secure only; then you could easily
negate php all together and simply expose the directory via a location
so that it is web accessible and set it up to ask for auth using
htpasswd; a custom script, ldap or whatever.

And if you don't need security then why have php involved at all? simply
symlink to the directory or expose it via http and be done with the
problem in a minute or two.

Regards!




Re: [PHP] how to provide download of files mow in documentroot

2010-03-29 Thread Jan G.B.
2010/3/29 Nathan Rixham nrix...@gmail.com

 Jan G.B. wrote:
  Top posting sucks, so I'll answer the post somewhere down there.
  SCNR
 
  2010/3/29 Devendra Jadhav devendra...@gmail.com
 
  Then you can do file_get_contents within PHP. or any file handling
  mechanism.
  On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:
  Hi
  i am writing a web application in php
  this webapp primarily focuses on file uploads and downloads
  the uploaded files will be saved in a folder which is not in document
  root
  and my query is how will i be able to provide download to such files
  not
  located in document root via php
 
 
  Try something like that
  <?php
  $content = file_get_contents($filename);
  $etag = md5($content);
  header('Last-Modified: '.gmdate('D, d M Y H:i:s',
  filemtime($filename)).' GMT');
  header('ETag: '.$etag);
  header('Accept-Ranges: bytes');
  header('Content-Length: '.strlen($content));
  header('Cache-Control: '.$cache_value); // you decide
  header('Content-type: '.$should_be_set);
  echo $content;
  exit;
  ?>
 
  Depending on the $filesize, you should use something else than
  file_get_contents() (for example fopen/fread). file_get_contents on a
 huge
  file will exhaust your webservers RAM.

 Yup, so you can map the Directory /path/to in web server config; then
 allow access only from localhost + yourdomain. This means you can then
 request it like a URL and do a HEAD request to get the ETag etc., then
 return a 304 not modified if you received a matching etag Last-Modified
 etc; (thus meaning you only file_get_contents when really really needed).

 I'd advise against saying you Accept-Ranges bytes if you don't accept
 byte ranges (ie you aren't going to send little bits of the file).

 If you need the downloads to be secure only; then you could easily
 negate php all together and simply expose the directory via a location
 so that it is web accessible and set it up to ask for auth using
 htpasswd; a custom script, ldap or whatever.

 And if you don't need security then why have php involved at all? simply
 symlink to the directory or expose it via http and be done with the
 problem in a minute or two.

 Regards!


In my opinion, serving user content this way on a production server is wicked sick.
You don't want your visitors to upload malicious files that may trigger
modules such as mod_php in Apache. So it makes sense to store user uploads
outside of the docroot, with no symlink or anything of the sort.

One more thing added: your RAM will be exhausted even if you open that 600mb
file just once.
Apache's memory handling is a bit weird: if *one* Apache process is using
200mb RAM on *one* impression because your application uses that much, then
that process will not release the memory while it's serving another 1000
requests for `clear.gif` which is maybe 850b in size.
So better forget that file_get_contents() when the filesize can be huge. :-)
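For what it's worth, a chunked fopen()/fread() loop keeps memory flat regardless of file size (a sketch; the path is made up, and it assumes output buffering is off):

```php
<?php
$filename = '/var/uploads/big-video.avi'; // hypothetical 600mb upload

$fp = fopen($filename, 'rb');
if ($fp === false) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($filename));

// 8kb per iteration: peak memory stays a few kb, not the file size.
while (!feof($fp)) {
    echo fread($fp, 8192);
    flush();
}
fclose($fp);
?>
```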

Regards


Re: [PHP] how to provide download of files mow in documentroot

2010-03-29 Thread Anshul Agrawal
On Mon, Mar 29, 2010 at 6:10 PM, Nathan Rixham nrix...@gmail.com wrote:

 Jan G.B. wrote:
  Top posting sucks, so I'll answer the post somewhere down there.
  SCNR
 
  2010/3/29 Devendra Jadhav devendra...@gmail.com
 
  Then you can do file_get_contents within PHP. or any file handling
  mechanism.
  On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:
  Hi
  i am writing a web application in php
  this webapp primarily focuses on file uploads and downloads
  the uploaded files will be saved in a folder which is not in document
  root
  and my query is how will i be able to provide download to such files
  not
  located in document root via php
 
 
  Try something like that
  <?php
  $content = file_get_contents($filename);
  $etag = md5($content);
  header('Last-Modified: '.gmdate('D, d M Y H:i:s',
  filemtime($filename)).' GMT');
  header('ETag: '.$etag);
  header('Accept-Ranges: bytes');
  header('Content-Length: '.strlen($content));
  header('Cache-Control: '.$cache_value); // you decide
  header('Content-type: '.$should_be_set);
  echo $content;
  exit;
  ?>
 
  Depending on the $filesize, you should use something else than
  file_get_contents() (for example fopen/fread). file_get_contents on a
 huge
  file will exhaust your webservers RAM.

 Yup, so you can map the Directory /path/to in web server config; then
 allow access only from localhost + yourdomain. This means you can then
 request it like a URL and do a HEAD request to get the ETag etc., then
 return a 304 not modified if you received a matching etag Last-Modified
 etc; (thus meaning you only file_get_contents when really really needed).

 I'd advise against saying you Accept-Ranges bytes if you don't accept
 byte ranges (ie you aren't going to send little bits of the file).

 If you need the downloads to be secure only; then you could easily
 negate php all together and simply expose the directory via a location
 so that it is web accessible and set it up to ask for auth using
 htpasswd; a custom script, ldap or whatever.

 And if you don't need security then why have php involved at all? simply
 symlink to the directory or expose it via http and be done with the
 problem in a minute or two.

 Regards!



Also look at readfile() and fpassthru() if dealing with large files.
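For example, a minimal readfile() handler for files stored outside the docroot might look like this (the directory and parameter names are made up; the basename() call is the bare minimum of input validation):

```php
<?php
$base = '/var/uploads'; // hypothetical storage dir outside the docroot

// Never build the path from raw user input; basename() strips directory parts.
$name = isset($_GET['file']) ? basename($_GET['file']) : '';
$path = $base . '/' . $name;

if ($name === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . $name . '"');

readfile($path); // streams straight to the client, no full read into RAM
?>
```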

Moreover, if you have control over the webserver then you can use PHP only
for authenticating the getFile request and offload the file delivery
operation to your webserver (Apache, Nginx, lighttpd) using the X-Sendfile
header in the response.
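Sketched out, that looks roughly like this (requires mod_xsendfile on Apache; nginx uses X-Accel-Redirect with an internal location instead; the auth check and paths below are made up):

```php
<?php
// PHP only decides *whether* to serve; the web server does the sending.
if (empty($_SESSION['logged_in'])) { // hypothetical auth check
    header('HTTP/1.1 403 Forbidden');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');

// Apache + mod_xsendfile:
header('X-Sendfile: /var/private/report.pdf');

// nginx equivalent (the location must be marked "internal"):
// header('X-Accel-Redirect: /protected/report.pdf');
exit;
?>
```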

Best,
Anshul


Re: [PHP] how to provide download of files mow in documentroot

2010-03-29 Thread Nathan Rixham
Jan G.B. wrote:
 2010/3/29 Nathan Rixham nrix...@gmail.com
 
 Jan G.B. wrote:
 Top posting sucks, so I'll answer the post somewhere down there.
 SCNR

 2010/3/29 Devendra Jadhav devendra...@gmail.com

 Then you can do file_get_contents within PHP. or any file handling
 mechanism.
 On Mon, Mar 29, 2010 at 1:00 AM, ebhakt i...@ebhakt.com wrote:
 Hi
 i am writing a web application in php
 this webapp primarily focuses on file uploads and downloads
 the uploaded files will be saved in a folder which is not in document
 root
 and my query is how will i be able to provide download to such files
 not
 located in document root via php

 Try something like that
 <?php
 $content = file_get_contents($filename);
 $etag = md5($content);
 header('Last-Modified: '.gmdate('D, d M Y H:i:s',
 filemtime($filename)).' GMT');
 header('ETag: '.$etag);
 header('Accept-Ranges: bytes');
 header('Content-Length: '.strlen($content));
 header('Cache-Control: '.$cache_value); // you decide
 header('Content-type: '.$should_be_set);
 echo $content;
 exit;
 ?>

 Depending on the $filesize, you should use something else than
 file_get_contents() (for example fopen/fread). file_get_contents on a
 huge
 file will exhaust your webservers RAM.
 Yup, so you can map the Directory /path/to in web server config; then
 allow access only from localhost + yourdomain. This means you can then
 request it like a URL and do a HEAD request to get the ETag etc., then
 return a 304 not modified if you received a matching etag Last-Modified
 etc; (thus meaning you only file_get_contents when really really needed).

 I'd advise against saying you Accept-Ranges bytes if you don't accept
 byte ranges (ie you aren't going to send little bits of the file).

 If you need the downloads to be secure only; then you could easily
 negate php all together and simply expose the directory via a location
 so that it is web accessible and set it up to ask for auth using
 htpasswd; a custom script, ldap or whatever.

 And if you don't need security then why have php involved at all? simply
 symlink to the directory or expose it via http and be done with the
 problem in a minute or two.

 Regards!

 
 In my opinion, serving user content this way on a production server is wicked sick.
 You don't want your visitors to upload malicious files that may trigger
 modules such as mod_php in Apache. So it makes sense to store user uploads
 outside of the docroot, with no symlink or anything of the sort.

even the simplest of server configurations will ensure safety. just use
.htaccess to SetHandler default-handler which treats everything as
static content and serves it right up.
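That .htaccess could be as small as this (a sketch; drop it in the upload directory, which must still be web-accessible for it to apply):

```apache
# Treat everything in this directory as static content;
# nothing here is ever run through mod_php.
SetHandler default-handler

# Optional: force downloads instead of in-browser rendering.
<IfModule mod_headers.c>
    Header set Content-Disposition attachment
</IfModule>
```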

 One more thing added: your RAM will be exhausted even if you open that 600mb
 file just once.
 Apache's memory handling is a bit weird: if *one* Apache process is using
 200mb RAM on *one* impression because your application uses that much, then
 that process will not release the memory while it's serving another 1000
 requests for `clear.gif` which is maybe 850b in size.

again everything depends on how you have your server configured; you can
easily tell apache to kill each child after one run or a whole host of
other configs; but ultimately if you can avoid opening up that file in
php then do; serving statically as above is the cleanest quickest way to
do it (other than using s3 or similar).

regards!
