----- Original Message -----
From: "Richard Archer" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Monday, April 22, 2002 11:40 AM
Subject: Re: [PHP] Database and files
> At 11:25 AM +0100 22/4/02, Danny Shepherd wrote:
>
> >If you get multiple requests for files, expect the db to fall over very
> >quickly - what's wrong with storing them on the filesystem and having a
> >list of them in the db?
>
> If your DB falls over, get a better one. You wouldn't do this with
> Access, but MySQL or PostgreSQL will handle this with no problems.

I've tried it in MySQL - it didn't work - after inserting approx. 8MB of
data the MySQL server died with a 'server has gone away' message. And more
than 2-3 users simultaneously requesting files of only a few hundred KB
really seemed to kill performance.

> And storing them in the file system requires the web server process to
> have write access to the directory in which the files are stored. And
> so will any other users on that server. Security nightmare.

Having other users on your server is the security nightmare :) If you set
up the webserver to have its own user ('apache' instead of 'nobody') and
only allow the apache user access to those files, that should lessen the
problem.

> >One thing - note that the header names and the actual mimetype are in
> >lower case. Got weird results with anything different.
>
> Interesting tip. I'll try that out on Mac IE which never did download
> properly, IIRC.

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
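For anyone following the thread, here's a rough sketch of the filesystem approach being discussed: files live on disk, the database holds only the metadata, and the download script sends the headers in lower case per the tip above. The table and column names ('files', 'name', 'mimetype', 'path') are made up for illustration, and this assumes an already-open MySQL connection:

```php
<?php
// Look up the file's metadata in the db; the file itself stays on disk.
$id = (int) $_GET['id'];
$result = mysql_query("SELECT name, mimetype, path FROM files WHERE id = $id");
$file = mysql_fetch_assoc($result);

if ($file && is_readable($file['path'])) {
    // Header names and the mimetype in lower case, as suggested above.
    header('content-type: ' . strtolower($file['mimetype']));
    header('content-length: ' . filesize($file['path']));
    header('content-disposition: attachment; filename="' . $file['name'] . '"');
    readfile($file['path']);
} else {
    header('HTTP/1.0 404 Not Found');
}
?>
```

Only the download script's user (e.g. 'apache') needs read access to the directory, and nothing ever needs to shovel hundred-KB blobs through the database.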