I'm sorry if I've missed something in this thread. I've had the same problem before with a few different spiders. I actually like it when this happens, though, because it lets me find problems that would otherwise be missed on pages that are rarely, if ever, browsed by an actual user.
When an error was generated I would go hunt it down and correct it. For problems like a URL change, a URL variable change, or a page that no longer exists, I just had the handler catch the error and return nothing. However, I like the idea of returning an HTTP error status instead; that way it may prompt the spider to update its index.

-----Original Message-----
From: Matt Robertson [mailto:[EMAIL PROTECTED]
Sent: Wednesday, February 23, 2005 12:03 PM
To: CF-Talk
Subject: Re: Sloppy - Yahoo! Slurp throwing CFerrors

If you have an error handler, test http_user_agent at the top of it and, if it turns out to be Slurp, feed them something that makes them go away (as Dave suggested, a 200 status rather than showing them whatever your friendly error is). I've got the same problem and think I'll try this as soon as possible. Googlebot drives a client of mine nuts, and this may be his ticket as well.
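For anyone wanting to try this, a minimal sketch of the approach in CFML might look like the following. This is illustrative only: the spider names, the choice of status code, and the `friendly_error.cfm` template name are assumptions, not something from this thread. Whether you send a 200 (as Matt suggests) or a 404 (to encourage the spider to drop a dead URL from its index) depends on the situation.

```cfml
<!--- Hypothetical sketch: at the top of a site-wide error handler,
      detect known spiders by user agent and send them a bare HTTP
      status instead of the friendly error page. --->
<cfif FindNoCase("Slurp", CGI.HTTP_USER_AGENT)
      OR FindNoCase("Googlebot", CGI.HTTP_USER_AGENT)>
    <!--- For a URL that genuinely no longer exists, a 404 may
          encourage the spider to update its index. --->
    <cfheader statuscode="404" statustext="Not Found">
    <cfabort>
<cfelse>
    <!--- Real visitors still get the friendly error page.
          Template name is a placeholder. --->
    <cfinclude template="friendly_error.cfm">
</cfif>
```

The `cfabort` matters: without it the rest of the error handler would still run and could push the friendly-error markup to the spider after the status header has been sent.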

