Hi folks,

Our Evergreen environment has been experiencing a higher-than-usual volume of 
unwanted bot traffic in recent months. Much of this traffic looks like 
web crawlers hitting Evergreen-specific URLs from an enormous number of
different IP addresses. Judging from discussion in IRC last week, it sounds 
like other EG admins have been seeing the same thing. Does anyone have any 
recommendations for managing this traffic and mitigating its impact?

Some solutions that have been suggested/implemented so far:
- Geoblocking entire countries.
- Using Cloudflare's proxy service. There's some trickiness in getting this to 
work with Evergreen.
- Putting certain OPAC pages behind a captcha.
- Deploying publicly available blocklists of "bad bot" IPs/user agents/etc. 
(good but limited, and not EG-specific; a rough sketch of this approach 
follows the list).
- Teaching EG to identify and deal with bot traffic itself (but arguably this 
should happen before the traffic hits Evergreen).
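
To make the blocklist option concrete, here's a minimal Python sketch of
the idea: scan an Apache access log for requests whose user agent matches
a "bad bot" list, and print the offending IPs so they can be fed into a
firewall deny list. The log format (Apache "combined") and the blocklist
format (one user-agent substring per line) are my assumptions; adjust for
your own setup.

    #!/usr/bin/env python3
    """Flag IPs whose user agent matches a "bad bot" blocklist (sketch)."""
    import re
    import sys

    # Apache "combined" format: ip ident user [time] "request" status
    # bytes "referer" "user-agent"
    LOG_RE = re.compile(
        r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

    def load_blocklist(path):
        """One case-insensitive UA substring per line; # starts a comment."""
        with open(path) as f:
            return [ln.strip().lower() for ln in f
                    if ln.strip() and not ln.startswith("#")]

    def flagged_ips(log_path, patterns):
        """Return the set of client IPs whose UA matches any pattern."""
        ips = set()
        with open(log_path) as log:
            for line in log:
                m = LOG_RE.match(line)
                if not m:
                    continue
                ip, ua = m.group(1), m.group(2).lower()
                if any(p in ua for p in patterns):
                    ips.add(ip)
        return ips

    if __name__ == "__main__":
        # usage: badbots.py access.log bad-user-agents.txt
        for ip in sorted(flagged_ips(sys.argv[1], load_blocklist(sys.argv[2]))):
            print(ip)  # pipe into ipset/iptables/etc. as appropriate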

My organization is currently evaluating CrowdSec as another possible solution. 
Any opinions on any of these approaches?
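
For what it's worth, the detection half of the behaviour-based approach
can be sketched in a few lines: count requests per IP over a sliding
window and flag anything above a threshold. This is a toy illustration
only (the window and threshold below are made-up numbers, and it is not
CrowdSec's actual implementation):

    import time
    from collections import defaultdict, deque

    WINDOW = 60      # seconds (arbitrary)
    THRESHOLD = 120  # requests per window before an IP is flagged (arbitrary)

    hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def record(ip, now=None):
        """Record one request; return True if this IP now looks like a bot."""
        now = time.time() if now is None else now
        q = hits[ip]
        q.append(now)
        while q and q[0] < now - WINDOW:  # discard hits outside the window
            q.popleft()
        return len(q) > THRESHOLD

In practice something like this would run in front of Evergreen (in the
proxy layer or a log-tailing daemon), with flagged IPs fed into a
temporary ban list; as I understand it, that detect-then-ban split is
roughly how CrowdSec divides the work between its agent and its bouncers.
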
-- 
Jeff Davis
BC Libraries Cooperative
