On Wed, 2009-09-16 at 11:17 +0200, Andrea Giammarchi wrote:

> > The way I've always seen this approached before is by using the wget
> > command, which can be asked to just return the headers for a page. In
> > your case you'd be looking for all 200 codes, which means that all the
> > sites are up. This is faster than asking to return a full image each
> > time.
> But slower, because you need a separate request for each image.
> On the other hand, file_get_contents could return false positives, because
> the fact that we are asking for an image does not mean an image will be
> returned.
> I would go for a curl call, where you get both headers and content, so one
> call handles every case. A bit slower than a HEAD request alone, but surely
> faster than a HEAD request plus the full GET.
> One more thing: I hope you have the rights to grab these images. There are
> usually APIs or web services when a website wants to share images this way,
> but that does not seem to be the case here ...
> Regards

Requesting only the headers is a lot faster than requesting the headers
AND the file itself. I also wouldn't grab an image at all; try grabbing
just the HTML of a web page. You get the headers, and the HTML is likely
to be very small. Not only that, you can perform further tests on the
returned HTML, for example to see whether PHP is still running on the
remote site. All of this is very easy to accomplish with a single-line
call to wget.

