Hello guys,

I'm wondering if one can rate-limit requests globally (per second or per minute) while whitelisting Google and other crawlers by hard-coding their reverse-DNS hostnames in the config. If a reverse-DNS/hostname check is a no-go, can the same be achieved with a user-agent filter?
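To make the user-agent variant concrete, here's a rough sketch of what I'm imagining, in nginx-style syntax (zone name, rates, and the bot patterns are just placeholders, not something I've tested):

```nginx
# Pick the rate-limit key based on user agent:
# an empty key means the request is NOT counted against the limit.
map $http_user_agent $limit_key {
    default        $binary_remote_addr;  # everyone else: limited per client IP
    ~*googlebot    "";                   # exempt Google's crawler
    ~*bingbot      "";                   # exempt Bing's crawler
}

# Global limit: 10 requests/second per key, 10 MB of state.
limit_req_zone $limit_key zone=global:10m rate=10r/s;

server {
    location / {
        limit_req zone=global burst=20;
    }
}
```

I'm aware the user agent is trivially spoofable, which is why I'd prefer the reverse-DNS check if it's feasible.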

Any ideas are greatly appreciated.
Thx
Joe
