Re: [PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-18 Thread Wez Furlong

On 2001-05-18 05:28:19, [EMAIL PROTECTED] wrote:
 ID: 10701
 Status: Closed
 But a hint.  Don't use readfile(); fopen() the file and read it a bit at a
 time instead of sticking the entire thing in memory.

Does readfile() really read the whole thing into memory first??

Perhaps we should change it so that it reads chunks and then spits them
out; it won't break any scripts and will make PHP more memory efficient as
that guy suggested.

Or am I missing something?
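For illustration, the chunked behaviour proposed above could be sketched in userland like this (the function name `readfile_chunked` and the 8 KB default chunk size are illustrative choices, not anything from the thread or from PHP's actual sources):

```php
<?php
// Hypothetical userland sketch of a chunk-wise readfile():
// read and echo one block at a time, so peak memory is one chunk
// rather than the whole file. Returns the number of bytes sent,
// or false if the file cannot be opened.
function readfile_chunked($filename, $chunk_size = 8192)
{
    $fp = @fopen($filename, 'rb');
    if (!$fp) {
        return false;
    }
    $sent = 0;
    while (!feof($fp)) {
        $buf = fread($fp, $chunk_size);  // at most one chunk in memory
        if ($buf === false || $buf === '') {
            break;
        }
        echo $buf;
        $sent += strlen($buf);
    }
    fclose($fp);
    return $sent;
}
?>
```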

--Wez.


-- 
PHP Development Mailing List http://www.php.net/
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
To contact the list administrators, e-mail: [EMAIL PROTECTED]




[PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-18 Thread sniper

ID: 10701
Updated by: sniper
Reported By: [EMAIL PROTECTED]
Old-Status: Closed
Status: Open
Bug Type: Filesystem function related
Operating system: 
PHP Version: 4.0.5
Assigned To: 
Comments:

Reopened. This is not fixed.

--Jani


Previous Comments:
---

[2001-05-18 00:51:27] [EMAIL PROTECTED]
You are kidding right? Nice way to take down a server.

load average: 66.52, 33.25, 15.76

$fp = fopen("/web/sites/contentsite/".$f, "rb");
while(!feof($fp))
   {
   echo fgets($fp, 4096);
   }
fclose($fp);

---

[2001-05-18 00:28:19] [EMAIL PROTECTED]
What's to claim.  This is a support question that belongs on one of the mailing lists 
and not in the bug database.

But a hint.  Don't use readfile(); fopen() the file and read it a bit at a time
instead of sticking the entire thing in memory.

---

[2001-05-17 17:29:33] [EMAIL PROTECTED]
Anyone plan on claiming this?

---

[2001-05-07 07:34:13] [EMAIL PROTECTED]
Ok, this is a pretty interesting one, and I'm not even sure if it's a bug or just a 
memory limit thing. I am using a PHP wrapper for content files 
(.jpg|.gif|.asf|.mov|.ram); as you can guess, the movie files can be pretty large, 
upwards of 10MB, all getting read into memory, then spit back out. Couple that with a 
site that gets 10K+ visits a day and the server is on its knees with a load average of 
about 10 (when it gets to 25/30 things start swapping and it will die not long after 
that.)

here's the code used to wrap the content:

<?php
$mime_type = strtolower(strrchr($f, '.'));
$mime_type_array = array(
'.asf' => 'application/vnd.ms-asf',
'.jpg' => 'image/jpeg',
'.gif' => 'image/gif'
);

if(!in_array($mime_type, array_keys($mime_type_array)))
{
header("Location: /error.php");
}

$offset = 86400 * 3;
header("Expires: ".gmdate("D, d M Y H:i:s GMT", time() + $offset));
header("Cache-Control: max-age=".$offset.", must-revalidate");
header("Last-modified: ".gmdate("D, d M Y H:i:s GMT", 
filemtime("/web/sites/contentsite/".$f)));
header("Content-type: ".$mime_type_array[$mime_type]);
header("Content-length: ".filesize("/web/sites/contentsite/".$f));
@readfile("/web/sites/contentsite/".$f);
?>

so, I would pass an image or movie to the content file with a URL like so:

http://contentsite.com/content.php?f=movies/bigmovie.asf


This is really just a heads up at this point, I know it will take you guys a little 
while to sort through this one, I'm not even sure it's a bug considering readfile is 
SUPPOSED to read a file into memory and spit it back out. I dunno, for now I'm going 
to do some .htaccess tricks where I force php to parse .htaccess files. If anyone has 
come across this or has any insight on wrapping content in php files, please email me 
at [EMAIL PROTECTED]

Thanks!
Stephen VanDyke

PS - aside from that, great language, I love PHP :)

---



ATTENTION! Do NOT reply to this email!
To reply, use the web interface found at http://bugs.php.net/?id=10701&edit=2






[PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-18 Thread sas

ID: 10701
Updated by: sas
Reported By: [EMAIL PROTECTED]
Old-Status: Open
Status: Closed
Bug Type: Filesystem function related
Operating system: 
PHP Version: 4.0.5
Assigned To: 
Comments:

It is almost impossible to tell from the output of system commands like top why 
certain processes have such a huge memory footprint and what kind of memory they are 
associated with.

In this case, you get the impression that the processes use much memory because PHP 
maps files into the web server's address space and sends them out in one quick move. 
These mappings do not cause swapping, though, as they can be dropped from physical RAM 
without being written to swap.

Your problem is probably caused by using Apache to serve large static files. Every 
download ties up one Apache instance, and hence the available RAM limits the number of 
users you can serve. Switching the file serving to e.g. thttpd would free up 
significant resources (RAM and CPU time).







[PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-17 Thread stephen

ID: 10701
User Update by: [EMAIL PROTECTED]
Status: Open
Bug Type: Filesystem function related
Operating system: Linux 2.4.x
PHP Version: 4.0.5
Description: readfile usage on large files

Anyone plan on claiming this?



Full Bug description available at: http://bugs.php.net/?id=10701






[PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-17 Thread rasmus

ID: 10701
Updated by: rasmus
Reported By: [EMAIL PROTECTED]
Old-Status: Open
Status: Closed
Bug Type: Filesystem function related
Operating system: 
PHP Version: 4.0.5
Assigned To: 
Comments:

What's to claim.  This is a support question that belongs on one of the mailing lists 
and not in the bug database.

But a hint.  Don't use readfile(); fopen() the file and read it a bit at a time
instead of sticking the entire thing in memory.








[PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-17 Thread stephen

ID: 10701
User Update by: [EMAIL PROTECTED]
Status: Closed
Bug Type: Filesystem function related
Operating system: Linux 2.4.x
PHP Version: 4.0.5
Description: readfile usage on large files

You are kidding right? Nice way to take down a server.

load average: 66.52, 33.25, 15.76

$fp = fopen("/web/sites/contentsite/".$f, "rb");
while(!feof($fp))
   {
   echo fgets($fp, 4096);
   }
fclose($fp);







Re: [PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-17 Thread Rasmus Lerdorf

Well, you'd want to do it one block at a time.  But yes, if you are going
to be reading the files with PHP that's what you'll end up doing at some
level anyway.  Otherwise look at Apache's mod_headers and perhaps
dynamically generate the header information and write the appropriate
.htaccess files or something...  But now we are well beyond having
anything to do with PHP.

-Rasmus
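Concretely, the block-at-a-time version of the loop quoted below might read like this (a sketch only: the function name is illustrative, and fread() replaces the line-oriented fgets(), which is a poor match for binary data such as movie files):

```php
<?php
// Block-at-a-time output for the content wrapper: fread() pulls a
// fixed 4 KB block regardless of newline positions, and the same
// handle ($fp) is used consistently throughout. Returns bytes sent,
// or false if the file cannot be opened.
function send_blocks($path)
{
    $fp = fopen($path, "rb");
    if (!$fp) {
        return false;
    }
    $sent = 0;
    while (!feof($fp)) {
        $buf = fread($fp, 4096);   // one 4 KB block in memory at a time
        if ($buf === false || $buf === '') {
            break;
        }
        echo $buf;
        $sent += strlen($buf);
    }
    fclose($fp);
    return $sent;
}
// e.g. send_blocks("/web/sites/contentsite/".$f);
?>
```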

On 18 May 2001 [EMAIL PROTECTED] wrote:

 ID: 10701
 User Update by: [EMAIL PROTECTED]
 Status: Closed
 Bug Type: Filesystem function related
 Operating system: Linux 2.4.x
 PHP Version: 4.0.5
 Description: readfile usage on large files

 You are kidding right? Nice way to take down a server.

 load average: 66.52, 33.25, 15.76

 $fp = fopen("/web/sites/contentsite/".$f, "rb");
 while(!feof($fp))
    {
    echo fgets($fp, 4096);
    }
 fclose($fp);





RE: [PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-17 Thread Stephen VanDyke

Well, the reason I stayed away from trying to use fgets for a single block
at a time was because of some of the comments from
http://php.net/manual/en/function.fgets.php

Anyways, an offtopic question if I may, I've tried making .htaccess
parseable by PHP using AddType and also attempting to do it by making
.htaccess a PHP-CGI file, to no avail. Is this something that is possible to
do with PHP? I have seen examples where mod_perl is used to generate
.htaccess files on the fly but nothing when it comes to PHP. If there is any
documentation in this area I would really be keen to see it.

-Stephen

-Original Message-
From: Rasmus Lerdorf [mailto:[EMAIL PROTECTED]]
Sent: Friday, May 18, 2001 12:56 AM
To: [EMAIL PROTECTED]
Cc: [EMAIL PROTECTED]
Subject: Re: [PHP-DEV] Bug #10701 Updated: readfile usage on large files


Well, you'd want to do it one block at a time.  But yes, if you are going
to be reading the files with PHP that's what you'll end up doing at some
level anyway.  Otherwise look at Apache's mod_headers and perhaps
dynamically generate the header information and write the appropriate
.htaccess files or something...  But now we are well beyond having
anything to do with PHP.

-Rasmus





RE: [PHP-DEV] Bug #10701 Updated: readfile usage on large files

2001-05-17 Thread Rasmus Lerdorf

 Well, the reason I stayed away from trying to use fgets for a single block
 at a time was because of some of the comments from
 http://php.net/manual/en/function.fgets.php

 Anyways, an offtopic question if I may, I've tried making .htaccess
 parseable by PHP using AddType and also attempting to do it by making
 .htaccess a PHP-CGI file, to no avail. Is this something that is possible to
 do with PHP? I have seen examples where mod_perl is used to generate
 .htaccess files on the fly but nothing when it comes to PHP. If there is any
 documentation in this area I would really be keen to see it.

Why would you have PHP parse a .htaccess file?  That makes no sense.
.htaccess files are parsed by Apache even on php requests.

What I was saying is that you would generate the .htaccess whenever the
header information needs to change and when the user requests a file, it
will be sent directly by Apache with mod_headers applying the custom
headers.  There is no PHP involved in this.
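Concretely, such a generated .htaccess might carry mod_headers directives mirroring what the PHP wrapper sets (an illustrative fragment, assuming mod_headers is loaded; the extension list and the max-age value of 259200 seconds, i.e. 86400 * 3, come from the original report):

```apache
# Illustrative .htaccess generated ahead of time; Apache then serves
# the static files itself and applies the headers via mod_headers,
# with no PHP in the request path.
<FilesMatch "\.(asf|jpg|gif)$">
    Header set Cache-Control "max-age=259200, must-revalidate"
</FilesMatch>
```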

-Rasmus

