At Fri, 9 Mar 2001 12:50:17 -0500, mallum <[EMAIL PROTECTED]> wrote:

[RSS feeds]

> I run http://10.am and do this on a largish scale.
> 
> For aggregating RSS feeds I use RSSLite [1] rather than XML::RSS. 
> RSSLite avoids using expat and is a little naughty in parsing XML 
> that would make expat barf (a lot of RSS feeds unfortunately contain 
> bad XML).
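
For anyone who hasn't seen it, the lenient approach looks roughly like
this - a minimal sketch, assuming XML::RSSLite's exported
parseRSS(\%hash, \$content) interface and a made-up feed URL:

    #!/usr/bin/perl
    use strict;
    use warnings;

    use LWP::Simple qw(get);
    use XML::RSSLite;             # regex-based, deliberately forgiving

    # Made-up feed URL, for illustration only
    my $content = get('http://example.com/feed.rss')
        or die "Couldn't fetch feed\n";

    # parseRSS() shrugs off markup that would make an
    # Expat-based parser die
    my %channel;
    parseRSS(\%channel, \$content);

    print "Channel: $channel{title}\n";
    print "  $_->{title}\n" for @{ $channel{item} };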

Is it just me, or does this idea fill anyone else with dread?

I hope we all agree that it all started to go wrong for the web when
browsers started to accept invalid HTML. This led inevitably to
the situation we have now, where 95%[1] of web sites are made up of
invalid HTML. XML was supposed to solve these problems by being far 
stricter about non-well-formed documents - this is why Expat just 
barfs on input that isn't well-formed. If we start "going soft" on 
such documents, surely we run the risk of ending up in exactly the 
same mess that we have with HTML now.
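
To make that concrete: an Expat-based parser refuses input at the
first well-formedness error rather than guessing. A minimal sketch
(the broken snippet is invented for illustration):

    #!/usr/bin/perl
    use strict;
    use warnings;

    use XML::Parser;   # thin Perl wrapper around Expat

    # An unclosed <title> element - exactly the sort of "bad XML"
    # that turns up in real RSS feeds
    my $bad_rss =
        '<rss version="0.91"><channel><title>News</channel></rss>';

    eval { XML::Parser->new->parse($bad_rss) };
    print "Expat barfed, as it should: $@" if $@;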

If someone sends you invalid XML in an RSS file, I'd recommend hitting
them over the head with a clue-by-four until they fix it. Another nice
touch might be some text on your web site saying "We hoped to have an
RSS feed from [web site of choice] but they are incapable of providing
a valid XML file".

Dave...

p.s. But then, bear in mind that I still think that browsers should 
respond to invalid XML by putting up a page that says "the author of
this page is a fuckwit" - so my views on the matter might not be 
exactly mainstream :)

[1] Remember kids, 83% of statistics are made up on the spot!
