On Mon, April 10, 2006 4:46 pm, darren kirby wrote:
> quoth the Robert Cummings:
>>
>> Why do you do this on every request? Why not have a cron job
>> retrieve an update every 20 minutes or whatnot and stuff it into a
>> database table for your page to access? Then if the cron fails to
>> retrieve the feed it can just leave the table as is, and your
>> visitors can happily view slightly outdated feeds? Additionally this
>> will be so much faster that your users might even hang around on
>> your site :)
>
> This is a very interesting idea, but I am not sure if it is suitable
> for me at this point. First of all, one feed in particular can change
> in a matter of seconds, and I do want it to be as up to date as
> possible. Secondly, this is just for my personal site which is very
> low traffic, and it is not inconceivable that getting the feed every
> 20 minutes by cron would be _more_ taxing on the network than simply
> grabbing it per request...

Perhaps, then, you should:
- maintain a list of URLs and an acceptable "age" for each feed
- attempt to snag new content upon visit, if the cached content is "old"
- show the "old" content if the feed takes longer than X seconds to respond
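The three steps above could be sketched like this. Everything here is
invented for illustration (file-based cache, the function names, the
60-second and 3-second numbers); fetch_with_timeout() stands in for
whatever timeout-aware fetcher you end up rolling:

```php
<?php
// Sketch of the freshness logic: serve fresh cache, else try to fetch
// with a timeout, else fall back to the stale copy. All names, paths,
// and thresholds are placeholders.
function get_feed($url, $cache_file, $max_age = 60, $timeout = 3)
{
    // 1. Is the cached copy still within its acceptable "age"?
    $fresh_enough = file_exists($cache_file)
        && (time() - filemtime($cache_file)) < $max_age;
    if ($fresh_enough) {
        return file_get_contents($cache_file);
    }

    // 2. Cache is "old" (or missing): try to snag new content,
    //    but give up after $timeout seconds.
    $content = fetch_with_timeout($url, $timeout); // your fetcher here

    if ($content !== false) {
        file_put_contents($cache_file, $content);
        return $content;
    }

    // 3. Fetch was too slow or failed: show the "old" content instead.
    return file_exists($cache_file) ? file_get_contents($cache_file) : false;
}
```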

You'll STILL need a timeout, which, unfortunately, means you are stuck
rolling your own solution with http://php.net/fsockopen because all
the "simple" solutions pretty much suck in terms of network timeout.
:-(
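For the roll-your-own route, a bare-bones HTTP GET over fsockopen()
with a timeout on both the connect and the read might look roughly
like this (hostname and path are placeholders, and a real version
would also have to cope with redirects, HTTP/1.1 chunking, etc.):

```php
<?php
// Minimal HTTP/1.0 GET with timeouts on both the connect and the read.
// $host and $path are whatever feed you are grabbing.
function http_get_with_timeout($host, $path, $timeout = 3)
{
    // Connect timeout: the fifth argument to fsockopen().
    $fp = @fsockopen($host, 80, $errno, $errstr, $timeout);
    if (!$fp) {
        return false;
    }

    // Read timeout: without this, a stalled server can hang the reads.
    stream_set_timeout($fp, $timeout);

    fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");

    $response = '';
    while (!feof($fp)) {
        $response .= fread($fp, 8192);
        $meta = stream_get_meta_data($fp);
        if ($meta['timed_out']) {   // bail out instead of blocking forever
            fclose($fp);
            return false;
        }
    }
    fclose($fp);

    // Strip the headers; return just the body.
    $parts = explode("\r\n\r\n", $response, 2);
    return isset($parts[1]) ? $parts[1] : false;
}
```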

It would be REALLY NIFTY if fopen() and friends, which understand all
those protocols (HTTP, FTP, HTTPS, and so on), allowed one to set a
timeout for URLs, but they don't, and nobody with the skills to change
that (not me) seems even mildly interested. :-( :-( :-(

Since I've already written a class that does something like what you
want, or maybe even exactly what you want, I might as well just
provide the source, eh?

http://l-i-e.com/FeedMulti/FeedMulti.phps

-- 
Like Music?
http://l-i-e.com/artists.htm

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php