Yes, these are hard problems. The problem is even harder when the
result is not invalid, but the connection to the RSS feed simply fails.

Anyway, the way I solved it was by reading the RSS feeds from the
filesystem. That always works.

Set up a cron job which fetches your RSS feeds every 5 minutes, for
example, and loads them with flow; IF flow can parse the XML, it writes
the feed to disk. That way, you have 2 major advantages:

The RSS feeds you need are on your own filesystem, so they are always
valid XML (because invalid feeds are never stored), and the second major
advantage is that it will outrun your current solution by a factor of
100 in speed. Caching is now very well possible, etc. etc.
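To make the idea concrete, here is a minimal sketch of the fetch-validate-store step in plain Python (NOT the author's Cocoon flow; the function name `refresh_feed` and the use of `ElementTree` for validation are my own assumptions). The point is that the cached file is only replaced when the new response parses as XML, so an HTML 404 page can never corrupt the cache:

```python
import os
import tempfile
import urllib.request
import xml.etree.ElementTree as ET

def refresh_feed(url, cache_path):
    """Fetch `url`; replace `cache_path` only if the body is valid XML.

    Hypothetical helper illustrating the cron-job idea: on any fetch or
    parse failure, the last known-good copy on disk is left untouched.
    """
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
        # Raises ParseError on invalid XML (e.g. an HTML 404 page).
        ET.fromstring(body)
    except Exception:
        return False  # keep the previous valid feed on disk
    # Write to a temp file and swap atomically, so a reader (the portal)
    # never sees a half-written feed.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(cache_path) or ".")
    with os.fdopen(fd, "wb") as f:
        f.write(body)
    os.replace(tmp, cache_path)
    return True
```

The portal then only ever transforms files from disk, which are valid XML by construction.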

I could send you the flow which writes the feed to disk when it is
valid; then you just need the cron job to execute a pipeline every x
minutes.
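For the cron side, an entry along these lines would do it (the pipeline name `refresh-feeds` and the port are made up; adjust the URL to whatever pipeline triggers your flow):

```
# Hypothetical crontab entry: hit the validating pipeline every 5 minutes.
*/5 * * * * wget -q -O /dev/null http://localhost:8080/cocoon/refresh-feeds
```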

AS

> 
> Hi!
> 
> I use the portal engine to aggregate some RSS feeds. If some newsfeed
> returns invalid XML, the whole portal stops working.
> 
> Some newsfeed incorrectly returned an HTML 404 page containing a
> PUBLIC DOCTYPE without a DTD reference. This breaks everything.
> I tried to catch this via <map:handle-errors>, but that doesn't
> work in this case.
> 
> Regards,
>   Alex
> 
> -- 
> Alexander Nofftz, Leverkusen, Germany, EU, Terra, Solar System
> [EMAIL PROTECTED] --- http://www.AlexNofftz.de/
> Jabber: [EMAIL PROTECTED] (Jabber?! 
> http://de.wikipedia.org/wiki/Jabber)
> 

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]