Thanks for the help.  I had assumed that all of the data from each page was
being downloaded, but it is not.

I do have one other question.  I've used microtime() to measure how long it
takes to download each page, and I've noticed that some pages can take as
long as 45 seconds to download!  I know this lag is out of my control, as it
depends on the status of the remote server.  However, I would like to tell
the script that if "x" amount of time has passed and the fopen() call has
not completed, it should give up and move on.  I've tried while and
if...else statements using microtime() as the timeout limit, but without
success.  What kind of flow-control mechanism should I use for this?
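One possible approach (a sketch, not tested against your servers; the host
name below is a placeholder): instead of fopen(), use fsockopen(), which
accepts a connect timeout as its fifth argument, and stream_set_timeout()
to bound each read.  stream_get_meta_data() then reports whether a read
timed out, so the loop can bail out and move to the next server.

```php
<?php
// Connect timeout via fsockopen()'s fifth argument (seconds);
// per-read timeout via stream_set_timeout().
$host    = 'www.example.com';  // placeholder host
$timeout = 10;                 // give up after 10 seconds

$fp = @fsockopen($host, 80, $errno, $errstr, $timeout);
if (!$fp) {
    // Connection failed or timed out -- skip this server.
    echo "Skipping $host: $errstr ($errno)\n";
} else {
    stream_set_timeout($fp, $timeout);
    fwrite($fp, "GET / HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");
    $page = '';
    while (!feof($fp)) {
        $page .= fgets($fp, 4096);
        $meta = stream_get_meta_data($fp);
        if ($meta['timed_out']) {
            // A read stalled past the limit -- abandon this page.
            break;
        }
    }
    fclose($fp);
}
?>
```

If you want to keep using plain fopen('http://...'), the
default_socket_timeout ini setting (e.g. ini_set('default_socket_timeout',
10)) controls how long the HTTP wrapper waits when connecting, though it
gives you less control than the socket approach above.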

RS
"Rusty Small" <[EMAIL PROTECTED]> wrote in message
news:[EMAIL PROTECTED]...
> I have written a script whose main purpose is to download HTML pages
> from multiple web servers.  I'm using the fopen() function to download
> these pages.  I would like to download only the source (text) and no
> binary data, as this would greatly improve the speed of my script.
>
> I've seen this on the client side with browsers being set to text-only
> mode.  Is there a way to do this with PHP on the server side, to tell
> the remote web server not to send the images associated with a
> particular URL?  I'm running Red Hat 7.3 and the Apache web server.
>
> Any help would be greatly appreciated.
>
> Cliff
>
>



-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php
