CVSROOT:	/cvs
Module name:	ports
Changes by:	st...@cvs.openbsd.org	2025/03/28 05:39:57
Log message:
import ports/www/anubis, ok tb

Anubis acts as middleware between a reverse proxy and a backend web server.
It assesses whether a connection is likely to be from a scraper bot and, if
there appears to be a chance of this, it issues a SHA-256 proof-of-work
challenge before allowing the connection to proceed.

As of 1.14.x, Anubis decides to present a challenge using this logic
(a sketch of this check follows after the import details below):

- User-Agent contains "Mozilla"
- Request path is not in /.well-known, /robots.txt, or /favicon.ico
- Request path is not obviously an RSS feed (ends with .rss, .xml, or .atom)

This should ensure that git clients, RSS readers, and other low-harm clients
can get through without issue, while high-risk clients such as browsers and
AI scraper bots impersonating browsers will be blocked.

When a challenge is passed, a signed JSON Web Token (JWT) is provided as a
cookie, allowing future requests to pass without triggering the challenge
again.

Using Anubis will likely result in your website not being indexed by some
search engines. This is considered a feature, not a bug.

Status:

Vendor Tag:	sthen
Release Tags:	sthen_20250328

N ports/www/anubis/Makefile
N ports/www/anubis/distinfo
N ports/www/anubis/modules.inc
N ports/www/anubis/pkg/DESCR
N ports/www/anubis/pkg/PLIST
N ports/www/anubis/pkg/README
N ports/www/anubis/pkg/anubis.rc

No conflicts created by this import
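
The sketch referenced above: a minimal, hedged Go approximation of the 1.14.x
challenge decision described in the log message. This is not the actual Anubis
source; the function name shouldChallenge, the exact path comparisons, and the
example requests are assumptions for illustration only.

package main

import (
	"fmt"
	"net/http"
	"strings"
)

// shouldChallenge approximates the decision described above: challenge only
// browser-like User-Agents, and let well-known metadata paths and feed-like
// paths through so low-harm clients are not interrupted.
func shouldChallenge(r *http.Request) bool {
	if !strings.Contains(r.UserAgent(), "Mozilla") {
		return false // git clients, curl, most feed readers, etc.
	}
	p := r.URL.Path
	if strings.HasPrefix(p, "/.well-known") || p == "/robots.txt" || p == "/favicon.ico" {
		return false
	}
	for _, ext := range []string{".rss", ".xml", ".atom"} {
		if strings.HasSuffix(p, ext) {
			return false // probably an RSS/Atom feed
		}
	}
	// Browser-like UA on an ordinary path: issue the proof-of-work challenge
	// (unless a valid signed JWT cookie from an earlier pass is presented).
	return true
}

func main() {
	page, _ := http.NewRequest("GET", "https://example.org/blog/post", nil)
	page.Header.Set("User-Agent", "Mozilla/5.0 (X11; OpenBSD amd64; rv:128.0)")
	fmt.Println(shouldChallenge(page)) // true: browser-like UA, ordinary path

	feed, _ := http.NewRequest("GET", "https://example.org/index.atom", nil)
	feed.Header.Set("User-Agent", "Mozilla/5.0")
	fmt.Println(shouldChallenge(feed)) // false: feed path is allowed through
}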