"Rusty Small" <[EMAIL PROTECTED]> wrote in message
> Thanks for the help.  I made the assumption that all data from each page
> was being downloaded, but it is not.
> I do have one other question.  I've used microtime() to test how long it
> takes to download each page and have noticed that some pages may take as
> long as 45 seconds to download!  I know that this lag is out of my control,
> as it depends upon the status of the remote server.  However, I would
> like to tell the script that if "x" amount of time has passed and the
> fopen() function is not complete, then stop and move on.  I've tried using
> while and if...else statements with microtime() as the timeout limit but
> didn't have any success.  What kind of flow control mechanism should I use
> for this?

  I have never done this myself, but give this a shot; it should work.

You can use passthru()/system() to call wget (you can specify the timeout in
wget, and wget will let you know the success or failure) and download the file
to a temporary folder. Then you can use fopen() to open the file.
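A minimal sketch of that idea, assuming wget is installed and on the PATH
(the URL and temp file path here are just placeholders):

```php
<?php
// Placeholder values -- substitute your own URL and temp location.
$url     = 'http://www.example.com/page.html';
$tmpFile = '/tmp/page.html';

// --timeout limits each network operation to 10 seconds; --tries=1
// stops wget from retrying, so a dead server is abandoned quickly.
// -q silences output, -O writes the page to the temp file.
$cmd = 'wget --timeout=10 --tries=1 -q -O ' . escapeshellarg($tmpFile)
     . ' ' . escapeshellarg($url);

system($cmd, $status);

if ($status === 0) {
    // Download succeeded within the timeout; read it as before.
    $fp = fopen($tmpFile, 'r');
    while (!feof($fp)) {
        $line = fgets($fp);
        // ... process the page ...
    }
    fclose($fp);
} else {
    // wget exits non-zero on timeout or failure; skip this page and move on.
    echo "Skipping $url (wget exit code $status)\n";
}
?>
```

escapeshellarg() is used so that URLs containing shell metacharacters
can't break (or abuse) the command line.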



PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
