Re: [PHP] Emergency! Performance downloading big files

2009-12-02 Thread דניאל דנון
Try using CURL - with that you can download many links simultaneously!

On Wed, Dec 2, 2009 at 12:48 AM, Brian Dunning wrote:
> This is a holiday-crunch emergency.
>
> I'm dealing with a client from whom we need to download many large PDF docs
> 24x7, several thousand per hour, all between a few hundred K and about 50 MB.
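For reference, a minimal sketch of what downloading "many links simultaneously" could look like with PHP's curl_multi_* functions; the URL list, save paths, and options below are illustrative assumptions, not anything from the original thread:

<?php
// Parallel downloads via curl_multi; $downloads maps URL => local file
// (hypothetical values for illustration).
$downloads = array(
    'https://example.com/docs/a.pdf' => '/tmp/a.pdf',
    'https://example.com/docs/b.pdf' => '/tmp/b.pdf',
);

$mh = curl_multi_init();
$handles = array();

foreach ($downloads as $url => $path) {
    $fp = fopen($path, 'wb');
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);              // stream body straight to disk
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = array($ch, $fp);
}

$active = null;
do {
    curl_multi_exec($mh, $active);                    // drive all transfers
    curl_multi_select($mh);                           // wait for activity instead of spinning
} while ($active > 0);

foreach ($handles as $pair) {
    curl_multi_remove_handle($mh, $pair[0]);
    curl_close($pair[0]);
    fclose($pair[1]);
}
curl_multi_close($mh);
?>

In practice you would cap the number of simultaneous handles (a few dozen at most) so the remote server is not hammered.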

Re: [PHP] Emergency! Performance downloading big files

2009-12-02 Thread Kim Madsen
Brian Dunning wrote on 2009-12-01 23:48:
This is a holiday-crunch emergency. I'm dealing with a client from whom we
need to download many large PDF docs 24x7, several thousand per hour, all
between a few hundred K and about 50 MB. Their security process requires the
files to be downloaded via

Re: [PHP] Emergency! Performance downloading big files

2009-12-01 Thread Nathan Nobbe
On Tue, Dec 1, 2009 at 4:56 PM, LinuxManMikeC wrote:
> On Tue, Dec 1, 2009 at 3:48 PM, Brian Dunning wrote:
> >
> > This is a holiday-crunch emergency.
> >
> > I'm dealing with a client from whom we need to download many large PDF
> > docs 24x7, several thousand per hour, all between a few hundred K and
> > about 50 MB.

Re: [PHP] Emergency! Performance downloading big files

2009-12-01 Thread Brian Dunning
Can someone explain how this would work? It's a Windows web server running IIS and the files are saved to a drive that is outside the web root. PHP is grabbing each filename from a MySQL database, along with the URL and credentials for it, and ends up with a url something like this: https://serv
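A rough sketch of the loop being described, purely for illustration (the table name, column names, save path, and auth style are guesses, not Brian's actual code); the key detail is CURLOPT_FILE, which streams each PDF to disk instead of buffering the whole response in PHP's memory:

<?php
// Hypothetical schema: docs(url, username, password, filename).
$db = new mysqli('localhost', 'dbuser', 'dbpass', 'docs_db');
$result = $db->query('SELECT url, username, password, filename FROM docs');

while ($row = $result->fetch_assoc()) {
    $fp = fopen('D:\\pdfstore\\' . $row['filename'], 'wb');    // outside the web root
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_FILE, $fp);                       // write straight to the file handle
    curl_setopt($ch, CURLOPT_USERPWD, $row['username'] . ':' . $row['password']);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 300);                    // don't let one file hang the run
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);
}
?>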

Re: [PHP] Emergency! Performance downloading big files

2009-12-01 Thread LinuxManMikeC
On Tue, Dec 1, 2009 at 3:48 PM, Brian Dunning wrote:
>
> This is a holiday-crunch emergency.
>
> I'm dealing with a client from whom we need to download many large PDF docs
> 24x7, several thousand per hour, all between a few hundred K and about 50 MB.
> Their security process requires the files

Re: [PHP] Emergency! Performance downloading big files

2009-12-01 Thread Michael Shadle
On Tue, Dec 1, 2009 at 3:21 PM, James McLean wrote:
> The suggestion from other users of off-loading the PDF downloading to
> Apache (or another webserver) is a good idea also.

^ I never allow PHP to be [ab]used and kept open to spoonfeed clients with
fopen/readfile/etc.

in nginx: header("X-A
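The snippet is cut off at header("X-A...; it presumably refers to nginx's X-Accel-Redirect header, where PHP only decides whether the request is allowed and nginx pushes the actual bytes. A minimal sketch, with the /protected/ prefix, file name, and directory made up for the example:

<?php
// PHP authorizes the request, then hands the transfer back to nginx.
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="report.pdf"');
header('X-Accel-Redirect: /protected/report.pdf');
exit;
?>

The matching nginx location would be marked internal so clients cannot request /protected/ directly, e.g. location /protected/ { internal; alias /var/files/pdf/; }. The Apache equivalent is the X-Sendfile header via mod_xsendfile.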

Re: [PHP] Emergency! Performance downloading big files

2009-12-01 Thread James McLean
On Wed, Dec 2, 2009 at 9:18 AM, Brian Dunning wrote:
> This is a holiday-crunch emergency.

Aren't they all! :)

> It's WAY TOO SLOW. I can paste the URL into a browser and download even the
> largest files quite quickly, but the PHP method bottlenecks and cannot keep
> up.

Are you certain you

Re: [PHP] Emergency! Performance downloading big files

2009-12-01 Thread Mari Masuda
On Dec 1, 2009, at 2:48 PM, Brian Dunning wrote:
> This is a holiday-crunch emergency.
[snip]
> Is there a SUBSTANTIALLY faster way to download and save these files? Keep in
> mind the client's requirements cannot be changed. Thanks for any suggestions.

Could you just put the URLs of the files

Re: [PHP] Emergency! Performance downloading big files

2009-12-01 Thread Ashley Sheridan
On Tue, 2009-12-01 at 14:51 -0800, Brian Dunning wrote:
> Oops, it's several hundred per hour, several thousand per day. Sorry for the
> accidental superlative.
>
> > I'm dealing with a client from whom we need to download many large PDF docs
> > 24x7, several thousand per hour, all between a few hundred K and about 50 MB.

Re: [PHP] Emergency! Performance downloading big files

2009-12-01 Thread Brian Dunning
Oops, it's several hundred per hour, several thousand per day. Sorry for the
accidental superlative.

> I'm dealing with a client from whom we need to download many large PDF docs
> 24x7, several thousand per hour, all between a few hundred K and about 50 MB.