It was mentioned in the release notes a while back. I agree that the
divergence between source installs (what the manual usually talks about)
and the packages has become a real problem.
On Thu, 4 May 2017 at 11:49, Michael Kuhn () wrote:
Hi Magnus

> > The sitemapper tool is baked in Koha. The packages have a handy
> > koha-sitemap script.
>
> And the documentation for it is available if you do this on the command line:
>
> $ man koha-sitemap

Yes - but first, of course, the world needs to know there IS such a
command. That's why I wrote
On 3 May 2017 at 20:52, Tomas Cohen Arazi wrote:
> The sitemapper tool is baked in Koha. The packages have a handy
> koha-sitemap script.
And the documentation for it is available if you do this on the command line:
$ man koha-sitemap
Best regards,
Magnus
Libriotech
The sitemapper tool is baked in Koha. The packages have a handy
koha-sitemap script.
Regards.
On Wed, 3 May 2017 at 11:45, Michael Kuhn () wrote:
Hi Mark

# ufw status
Status: active

To         Action      From
--         ------      ----
22/tcp     ALLOW       Anywhere
80/tcp     ALLOW       Anywhere
8080/tcp   ALLOW       Anywhere
Hi Hugo

You're not the only one who has suffered this from Google - Baidu is
worse, and some others as well. So, telegram-style answers to your points:

Yes, I have also suffered a lot from crawlers, and I have spent many
hours trying to adjust firewalls and robots.txt.

What version of Koha
Excerpts from Michael Kuhn's message of 2017-05-03 16:14:55 +0200:
> # ufw status
> Status: active
>
> To         Action      From
> --         ------      ----
> 22/tcp     ALLOW       Anywhere
> 80/tcp     ALLOW
Hi Mark and Hugo
Many thanks for your hints! I have now done the following.
1. I created a file "/usr/share/koha/opac/htdocs/robots.txt" containing
this:
Sitemap: sitemapindex.xml
User-agent: *
Disallow: /cgi-bin/
2. I generated a Koha sitemap using the seemingly undocumented Perl
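The robots.txt from step 1 can be sanity-checked with Python's standard urllib.robotparser; the sketch below feeds it exactly the rules shown above (the opac.example.org URLs are placeholders, not a real Koha host):

```python
from urllib.robotparser import RobotFileParser

# The rules from step 1, verbatim.
robots_txt = """\
Sitemap: sitemapindex.xml
User-agent: *
Disallow: /cgi-bin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Every crawler should be kept out of the CGI scripts...
print(rp.can_fetch("Googlebot", "http://opac.example.org/cgi-bin/koha/opac-search.pl"))  # False
# ...but still be allowed to fetch static files such as the sitemap.
print(rp.can_fetch("Googlebot", "http://opac.example.org/sitemapindex.xml"))  # True
```

Of course, robots.txt only keeps out well-behaved crawlers; anything that ignores it still has to be handled by the firewall.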
Hi

Yes, this is an annoying issue with bots - this one is Google, but there
are plenty of them...

You should use robots.txt properly, but if I am not wrong, with Google it
is more effective to go to the Google Webmaster tools site and modify the
Googlebot behaviour for your Koha installation.

You should also use a
> When I searched for who 66.249.64.32 is, I saw this IP address belongs
> to Google.
This does seem to be the Google indexer:
% nslookup 66.249.64.32
...
32.64.249.66.in-addr.arpa name = crawl-66-249-64-32.googlebot.com.
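A reverse DNS name alone can be spoofed, though: Google's documented advice is to confirm both directions - the PTR name must end in googlebot.com (or google.com) AND must resolve forward to the same IP. A minimal sketch of the suffix half in Python (the network lookups are left as comments, since they need DNS access):

```python
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_googlebot(hostname: str) -> bool:
    """True if a reverse-DNS name carries a Google crawler suffix."""
    # rstrip(".") tolerates the trailing dot that nslookup prints.
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

# Reverse lookup for the IP from the log above (needs network):
#   import socket
#   name, _, _ = socket.gethostbyaddr("66.249.64.32")
#   # name == "crawl-66-249-64-32.googlebot.com"
print(looks_like_googlebot("crawl-66-249-64-32.googlebot.com"))  # True
print(looks_like_googlebot("fake-googlebot.example.com"))        # False

# Forward confirmation (needs network):
#   socket.gethostbyname(name) == "66.249.64.32"
```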
I haven't seen this problem (yet), but perhaps that is because