[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2024-04-21 Thread bugzilla-daemon--- via Koha-bugs
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Michael changed: CC added: michael.r.ge...@gmail.com

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2024-03-20 Thread bugzilla-daemon--- via Koha-bugs
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 --- Comment #14 from David Cook --- (In reply to David Cook from comment #13) > (In reply to Katrin Fischer from comment #4) > > Should we include a default/sample robots.txt with Koha? > > It is tempting to add a robots.txt file to

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2022-08-21 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 --- Comment #13 from David Cook --- (In reply to Katrin Fischer from comment #4) > Should we include a default/sample robots.txt with Koha? It is tempting to add a robots.txt file to the koha-common package.
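A default file along these lines is what the thread is circling around. This is only a sketch, not the file Koha actually ships; the `Disallow` paths assume a standard package install where the OPAC search lives at `/cgi-bin/koha/opac-search.pl`, and `Crawl-delay` is a non-standard directive that only some crawlers honour:

```text
# Sample robots.txt for a Koha OPAC (illustrative, not shipped with Koha)
User-agent: *
# Keep crawlers out of the search script, which is the expensive page
# to render and the one that "falls prey" to crawlers in this bug
Disallow: /cgi-bin/koha/opac-search.pl
# Ask well-behaved bots to pace themselves (non-standard directive)
Crawl-delay: 30
```

The file would need to be served from the OPAC virtual host's document root to take effect.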

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2022-02-15 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 David Cook changed: CC added: dc...@prosentient.com.au

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2022-02-15 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Patrick Robitaille changed: CC updated

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2020-01-24 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 clodagh.ke...@educampus.ie changed: CC updated

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2020-01-23 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Barry Cannon changed: CC added: b...@interleaf.ie

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2019-10-12 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 --- Comment #11 from Katrin Fischer --- (In reply to tecnicouncoma from comment #10) > Hi people. I tried robots.txt and nothing happened. I installed Koha on an i7 > computer with 16 GB of RAM. > > We are rebooting the system twice a day

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2019-10-11 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 tecnicouncoma changed: CC added: tecnicounc...@gmail.com

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2019-07-25 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Teknia changed: CC added: nicjdevr...@gmail.com

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2016-12-23 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 --- Comment #8 from José Anjos --- I would suggest that a robots.txt like the one in comment 3 should come with fresh installs, firstly because most of the time people don't realize that Koha's performance is affected

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2016-12-22 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 --- Comment #7 from Fred P --- I don't believe this is a Koha issue. Any public site can be "hit" by any user. Blocking Chinese search giant Baidu makes a big difference. Disallow their robots and you will get
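Blocking a single aggressive crawler, as Fred P describes, is a one-stanza addition to robots.txt. This is a sketch assuming the crawler honours the exclusion protocol; Baidu's crawler announces itself with the `Baiduspider` user-agent token:

```text
# Refuse Baidu's crawler site-wide (only effective for bots
# that actually respect robots.txt)
User-agent: Baiduspider
Disallow: /
```

A per-bot `User-agent` group like this overrides the generic `User-agent: *` group for that bot, so it can sit alongside more permissive rules for other crawlers.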

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2016-12-22 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 José Anjos changed: CC updated

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2016-12-01 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Nicole C. Engard changed: CC removed: neng...@gmail.com

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2016-10-17 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 --- Comment #5 from Magnus Enger --- (In reply to Katrin Fischer from comment #4) > Should we include a default/sample robots.txt with Koha? There is a file called README.robots at the top of the project (with

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2016-10-16 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Katrin Fischer changed: CC updated

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2016-10-04 Thread bugzilla-daemon
https://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Bob Birchall changed: Status changed from NEW to In

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2014-09-17 Thread bugzilla-daemon
http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Pablo AB (pablo.bian...@gmail.com) changed: CC updated

[Koha-bugs] [Bug 4042] Public OPAC search can fall prey to web crawlers

2012-02-13 Thread bugzilla-daemon
http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=4042 Fred P (fred.pie...@smfpl.org) changed: CC updated