On 9/24/06, Ramiro <[EMAIL PROTECTED]> wrote:
Hi,
I'm trying to find a good solution to this problem: I want to let users download
files from a directory outside DocumentRoot.

This is a standard procedure.


These files must not be downloadable through a direct URL like
http://site/test.zip. They should only be downloadable after the user logs in.

This will need some sort of rewrite or 404 handling at the webserver level.


I know I can do that with functions like fopen() + fread() or readfile(),
then echo the file buffer to the browser with the correct headers.
But reading a file and then dumping it to the browser puts a big load on the server.

I readfile() about ten 6 MB files per second, and haven't had any
performance issues.
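
For what it's worth, here is a minimal sketch of that approach. The session key
($_SESSION['user_id']), the directory path, and the ?file= parameter name are my
own assumptions, not anything from this thread:

```php
<?php
// download.php -- hypothetical login-gated download script.
// The private directory lives outside DocumentRoot, so Apache
// can never serve these files directly.

// Resolve a requested filename to a safe path inside $dir, or null.
function safe_download_path($requested, $dir)
{
    // basename() strips any directory components, so "../" tricks
    // cannot escape the private directory.
    $name = basename($requested);
    if ($name === '' || $name === '.' || $name === '..') {
        return null;
    }
    $path = $dir . '/' . $name;
    return is_file($path) ? $path : null;
}

// Web entry point (skipped when the file is included from the CLI):
if (PHP_SAPI !== 'cli') {
    session_start();
    if (empty($_SESSION['user_id'])) {          // set by your login code
        header('HTTP/1.0 403 Forbidden');
        exit('Please log in first.');
    }
    $path = safe_download_path(
        isset($_GET['file']) ? $_GET['file'] : '',
        '/var/private/downloads'
    );
    if ($path === null) {
        header('HTTP/1.0 404 Not Found');
        exit;
    }
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    header('Content-Length: ' . filesize($path));
    readfile($path);
}
```

Users then fetch e.g. download.php?file=test.zip instead of a direct URL to the file.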


I ran a test that showed me "eating" 1.8% of RAM (measured with "ps
aux" on Linux, on a server with 2 GB of RAM) to serve a single 30 MB file at
60 KB/s. So imagine what a dump-to-browser PHP script would do with 50 to 100
concurrent downloads. I'd probably need a terabyte of RAM just to serve
downloads ;)

Post your test script and the exact PHP version you're running. readfile()
or fopen()/fread() won't use that much memory unless you don't specify a
length, or specify a rather large length, for fread().
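
To make that concrete, here is a sketch of the chunked fopen()/fread() loop
(the 8 KB buffer size is just my choice): peak memory stays around the buffer
size no matter how big the file is.

```php
<?php
// Stream a file to the browser in small chunks so that only one
// buffer's worth of data is in memory at any time.
function stream_file($path, $chunk = 8192)
{
    $fp = fopen($path, 'rb');
    if ($fp === false) {
        return false;
    }
    while (!feof($fp)) {
        echo fread($fp, $chunk);   // at most $chunk bytes held in memory
        flush();                   // hand the chunk off to the client
    }
    fclose($fp);
    return true;
}
```

Call this in place of readfile() after the headers have been sent; memory use
is then bounded by $chunk rather than by the file size.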


Here's my question, then: is there another way to protect files against direct
downloading (forcing users to log in and denying direct URLs)?

Yes.


I also know I can check the referer with mod_rewrite in Apache. But it isn't
secure, since the referer cannot be sent or can be faked.

You can check it without mod_rewrite.

I think you meant to say "it isn't secure, since the referrer can be
faked or not even sent."

So yes, this is true, this is not a security mechanism.
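
For completeness, a referer check in plain PHP would look something like this
(the host name is hypothetical) -- again, a casual hotlink deterrent only, not
a security mechanism:

```php
<?php
// Returns true only when the Referer header names the expected host.
// The header can be absent or forged, so treat this as a nicety only.
function referer_looks_local($referer, $host)
{
    if ($referer === null || $referer === '') {
        return false;               // browser sent no referer at all
    }
    $parts = parse_url($referer);
    return isset($parts['host']) && $parts['host'] === $host;
}

// Typical use in a download script:
// $ref = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : null;
// if (!referer_looks_local($ref, 'example.com')) { /* deny */ }
```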


Curt.

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php