Hi Developers,
here is another idea from the discussion about making the crawler more polite.
I suggest that we log a message whenever a requested URL is blocked by robots.txt. Ideally, we would only log this message when the currently used agent name is blocked specifically, and not when all agents are blocked in general.
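As a rough sketch of what I mean (using Python's urllib.robotparser here just for illustration; the agent name "MyCrawler" and the helper check_url are made-up names, not anything from our code base): the check can distinguish an agent-specific block from a general one by testing whether the wildcard agent "*" would still be allowed.

```python
# Sketch only: log when a URL is blocked by robots.txt specifically for
# our agent name, but stay silent when the block applies to all agents.
import logging
from urllib import robotparser

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("robots")

AGENT = "MyCrawler"  # hypothetical agent name, not from our config

def check_url(rp: robotparser.RobotFileParser, url: str) -> bool:
    """Return True if the URL may be fetched; log agent-specific blocks."""
    if rp.can_fetch(AGENT, url):
        return True
    if rp.can_fetch("*", url):
        # The generic "*" entry would allow this URL, so the robots.txt
        # rule that blocked us targets our agent name specifically.
        log.info("URL %s blocked by robots.txt for agent %r specifically",
                 url, AGENT)
    return False
```

This is only meant to show the distinction; the real patch would of course hook into wherever we already evaluate robots.txt.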

Should I create a patch?

Stefan
