I have a small site that I run on a not-for-profit basis. Periodically I need to update robots.txt or add firewall rules to shut down bad actors who beat the crap out of the site, running up my instance costs.
Lately, I've been getting slammed by instances running on AWS. They are mostly making HEAD requests, which makes me think it's some kind of crawler, but it uses regular browser user agents and doesn't respect my robots rules. There's no legitimate reason for AWS to browse my site, so I just add a firewall rule, right? Trouble is, AWS has 6,462 different IPv4 address ranges, and this crawler is constantly jumping between them.

Any costs that I have are paid out of my own pocket, so I'm looking for suggestions that don't require MORE subscriptions (like CloudFlare or something). Any ideas?

-Joshua

--
You received this message because you are subscribed to the Google Groups "Google App Engine" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To view this discussion on the web visit https://groups.google.com/d/msgid/google-appengine/1772968C-ED0C-49D9-BA35-AB7F2CA2AD1E%40gmail.com.
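One subscription-free angle on the "6,462 ranges" problem: AWS publishes its complete address list at https://ip-ranges.amazonaws.com/ip-ranges.json, so a short script can pull the EC2 prefixes (where the crawler's instances actually run) into an ipset, leaving iptables with a single match-set rule instead of thousands of individual ones. A minimal sketch, assuming a Linux box with ipset/iptables available; the embedded SAMPLE stands in for the real download (fetch it yourself with curl or cron), and the "aws-ec2" set name is just an arbitrary example:

```python
import json

# Tiny stand-in for the real https://ip-ranges.amazonaws.com/ip-ranges.json,
# which lists every prefix with a "service" tag (AMAZON, EC2, S3, ...).
SAMPLE = json.dumps({
    "prefixes": [
        {"ip_prefix": "3.5.140.0/22", "region": "ap-northeast-2", "service": "AMAZON"},
        {"ip_prefix": "52.95.110.0/24", "region": "eu-west-1", "service": "EC2"},
        {"ip_prefix": "52.95.110.0/24", "region": "eu-west-1", "service": "AMAZON"},
    ]
})

def ec2_prefixes(doc: str) -> list:
    """Return the unique IPv4 CIDRs tagged EC2 (customer instances live there)."""
    data = json.loads(doc)
    return sorted({p["ip_prefix"] for p in data["prefixes"] if p["service"] == "EC2"})

def ipset_commands(prefixes, set_name="aws-ec2"):
    """Emit shell commands: build a hash:net ipset, then drop matches with one rule."""
    cmds = ["ipset create %s hash:net -exist" % set_name]
    cmds += ["ipset add %s %s -exist" % (set_name, p) for p in prefixes]
    cmds.append("iptables -I INPUT -m set --match-set %s src -j DROP" % set_name)
    return cmds

if __name__ == "__main__":
    # Print the commands rather than running them, so they can be reviewed first.
    print("\n".join(ipset_commands(ec2_prefixes(SAMPLE))))
```

Re-running it from cron keeps the set current as AWS adds ranges, and a hash:net ipset lookup stays fast no matter how many prefixes it holds, which is the point of this approach over per-range iptables rules.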
