Just replying to my original post as a means of follow-up. I went ahead and used Rasmus' code; he was correct in surmising that I was only worried about an overloaded, slow-responding site. It would be very atypical for the site to be down completely.
While the cache code was very interesting, I think it is a bit of overkill
given the low volume of traffic my site receives. What I have done
instead is write the feed to a local file upon every successful update. I think
it is more elegant to show an old feed than the terse "Feed unavailable" in
the spot where the headlines should be.
I have tested by setting the timeout to '0' rather than yanking my ethernet
cable, and it seems to work well. I just don't have a suitable spare box to
try the ethernet method, but setting the timeout to 0 seems to have the
intended effect anyway.
Again, I just want to thank everybody involved for their time and effort; it
is very much appreciated.
Here is my function now, if anybody is interested:
function getFeed($url) {
    $cache_version = $_SERVER['DOCUMENT_ROOT'] . "/cache/" . basename($url);

    $rfd = @fopen($url, 'r');
    if ($rfd) {
        stream_set_blocking($rfd, true);
        stream_set_timeout($rfd, 5); // 5-second timeout
        $data = stream_get_contents($rfd);
        $status = stream_get_meta_data($rfd);
        fclose($rfd);
    }

    if (!$rfd || $status['timed_out']) {
        // Fetch failed or timed out: fall back to the cached copy
        $xml = simplexml_load_file($cache_version);
    }
    else {
        // Fresh fetch succeeded: refresh the cache, then parse
        $lfd = fopen($cache_version, 'w');
        fwrite($lfd, $data);
        fclose($lfd);
        $xml = simplexml_load_string($data);
    }

    print "<ul>\n";
    foreach ($xml->channel->item as $item) {
        // Escape ampersands so the link is valid HTML
        $cleaned = str_replace("&", "&amp;", $item->link);
        print "<li><a href='$cleaned'>$item->title</a></li>\n";
    }
    print "</ul>\n";
}
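In case it helps anyone trying it out, here is a minimal usage sketch. The feed URL is a placeholder, and a writable cache/ directory under the document root is assumed (the function does not create it):

```php
<?php
// Hypothetical example: print the headline list for one feed.
// Assumes DOCUMENT_ROOT/cache/ exists and is writable by the
// web server user, so the fallback copy can be saved.
getFeed("http://example.com/news.rss");
?>
```

With the timeout set to 0 as described above, the same call should come back from the cached file instead.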
-d
--
darren kirby :: Part of the problem since 1976 :: http://badcomputer.org
"...the number of UNIX installations has grown to 10, with more expected..."
- Dennis Ritchie and Ken Thompson, June 1972