On Mon, Aug 17, 2015 at 12:54 PM, Jos van den Oever
<j...@vandenoever.info> wrote:
> There is a (non-standard) instruction for robots.txt which reduces the crawl
> frequency.
> E.g. "Crawl-delay: 10" asks a crawler to wait 10 seconds between requests.
> Neither projects.kde.org nor quickgit.kde.org is using this at the moment.
>
>  
> http://stackoverflow.com/questions/17377835/robots-txt-what-is-the-proper-format-for-a-crawl-delay-for-multiple-user-agent
>
> If we do not let search engines index our primary product (source code), then
> it's not strange that people cannot find it.

There is no point in letting search engines index the whole source code, IMO.
But as sandsmark mentioned above, there should be one page describing where
to find the source code, how to get it, and how to contribute further, and
that page should be indexed.
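
For reference, a minimal robots.txt along these lines might look like the
sketch below (the paths and the delay value are illustrative, not a proposal
for the actual KDE configuration):

```
# Apply to all crawlers
User-agent: *
# Non-standard directive: ask crawlers to wait 10 seconds between requests.
# Note: not all crawlers honor it (Googlebot ignores Crawl-delay, for example).
Crawl-delay: 10
# Keep the landing/"how to contribute" page indexable, but skip the
# per-file repository browser (example path only):
Disallow: /example-repo-browser/
```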

-- 
Bhushan Shah

http://bhush9.github.io
IRC Nick : bshah on Freenode
_______________________________________________
kde-community mailing list
kde-community@kde.org
https://mail.kde.org/mailman/listinfo/kde-community
