On Wed, May 8, 2013 at 7:20 PM, Waynn Lue <[email protected]> wrote:
> I'm getting exceptions from my server when a bot starts crawling random
> incorrect un-escaped URLs (up to a few thousand now), specifically with % in
> the URL. I've looked at a bunch of issues in rack here and here and rails
> here, and looked at this SO thread but there doesn't seem to be a definitive
> solution. Is there a correct solution for GET errors? Do I have to
> monkeypatch rack?
Do you want to process these requests at all? If they are coming from a rogue
bot, you may just want to catch them and drop them. Make sure you also have a
robots.txt in your public/ folder. As for how Rails itself handles parsing
such strings, your guess is as good as mine...
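
If you do go the catch-and-drop route, you shouldn't need to monkeypatch Rack;
a small middleware in front of the app can turn away anything with a broken
%-escape before Rails tries to parse it. Here's a rough, untested sketch of
what I mean (the class name and where you insert it are just my choices, and
the env keys your particular server populates may differ):

require 'uri'

# Returns 400 for requests whose path or query string contains an invalid
# %-escape, so they never reach Rails params parsing.
class RejectBadEncoding
  def initialize(app)
    @app = app
  end

  def call(env)
    # URI.decode_www_form_component raises ArgumentError on strings like
    # "%ZZ" or a bare trailing "%", which is what these bots tend to send.
    URI.decode_www_form_component(env['PATH_INFO'].to_s)
    URI.decode_www_form_component(env['QUERY_STRING'].to_s)
    @app.call(env)
  rescue ArgumentError
    [400, { 'Content-Type' => 'text/plain' }, ['Bad Request']]
  end
end

Then register it near the top of the stack, e.g. in config/application.rb
with something like config.middleware.insert_before(0, "RejectBadEncoding").
Treat all of this as a starting point rather than a drop-in fix, since I
haven't run it against your setup.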

