> The way I've always seen this approached before is by using the wget
> command, which can be asked to just return the headers for a page. In
> your case you'd be looking for all 200 codes, which means that all the
> sites are up. This is faster than asking to return a full image each
but slower, because for each image you need two requests.
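The header-only check the quoted poster describes can be sketched like this (the URL is just a placeholder):

```shell
# --spider asks wget not to download the body;
# --server-response prints the response headers to stderr.
# A "200 OK" status line means the resource is reachable.
wget --spider --server-response http://example.com/image.jpg 2>&1 | grep "HTTP/"
```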
On the other hand, file_get_contents could give false positives, because the fact that we ask for an image does not mean an image will actually be returned.
I would go for a cURL call, where you get both the headers and the content, so one call can handle every case. It's a bit slower than a HEAD request alone, but surely faster than a HEAD request followed by a GET.
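A minimal sketch of that single-call cURL check (the helper name `image_is_up` and the timeout value are my own choices, not from the thread):

```php
<?php
// Check whether a URL actually serves an image, using one GET request
// so we get the HTTP status code, headers, and body in a single round trip.
function image_is_up($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang on dead hosts

    $body = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    $type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
    curl_close($ch);

    // A 200 alone is not enough: verify the server really sent image bytes,
    // which is exactly the false positive file_get_contents can't rule out.
    return $body !== false
        && $code === 200
        && is_string($type)
        && strpos($type, 'image/') === 0;
}
```

If you only ever need the status code, setting CURLOPT_NOBODY turns this into a HEAD request, but then you lose the ability to inspect the bytes you got back.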
One more thing: I hope you have the rights to grab these images. Usually there are APIs or web services when a website wants to share images this way, but that does not seem to be the case here...