https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042

--- Comment #7 from Fred P <fred.pie...@smfpl.org> ---
I don't believe this is a Koha issue. Any public site can be "hit" by any user.
Blocking the Chinese search giant Baidu makes a big difference: disallow its
robots and you will get far fewer hits. You can also block by IP address range
by editing your Apache .htaccess file. Keep in mind that you should back that
file up before making changes, and take precautions not to block your own
access!

In the .htaccess for the appropriate site directory, blocking the range 180.76
would shut out Baidu's crawlers (Apache 2.2 syntax):

order allow,deny
allow from all
#block by partial IP address (180.76.x.x)
deny from 180.76
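
On Apache 2.4, which replaced the Order/Allow/Deny directives with
mod_authz_core, a rough equivalent would be the following (a sketch; adjust
the range as needed):

<RequireAll>
    Require all granted
    Require not ip 180.76
</RequireAll>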

Adding the following to your document root as a robots.txt file should warn off
the Yandex and Baidu robots; however, spiders change, and respect for
robots.txt varies:

#Baiduspider
User-agent: Baiduspider
Disallow: /

#Yandex
User-agent: Yandex
Disallow: /
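
For crawlers that ignore robots.txt entirely, you can also match on the
User-Agent header in .htaccess. A minimal sketch using mod_setenvif, in the
same Apache 2.2 style as the example above (the bad_bot variable name is just
illustrative):

#flag unwanted crawlers by User-Agent
BrowserMatchNoCase "Baiduspider" bad_bot
BrowserMatchNoCase "Yandex" bad_bot
#deny flagged requests, allow everyone else
order allow,deny
allow from all
deny from env=bad_bot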

It looks like Chris' proposals were adopted. Does this bug need to remain open?
