Hi Bruno, I think you should keep the robots.txt files in case your redirect rule is modified by mistake.
Also, the redirect uses a wrong URL leading to a 404: there is no docs folder for changes.html. And it does not redirect if you hit http://home.apache.org/~milamber/jmeter

Thanks

On Thu, Oct 3, 2019 at 1:55 PM Milamber <[email protected]> wrote:
>
> Oops... so I removed the robots.txt files and added a .htaccess file to
> redirect to https://jmeter.apache.org/
>
>
> On 03/10/2019 09:45, Vladimir Sitnikov wrote:
> > Milamber> I just upload a robots.txt file with disallow all into each RC
> > into my public html folder on home.apache.org.
> >
> > Can you please clarify?
> >
> > https://www.robotstxt.org/robotstxt.html says the file should be placed at
> > the root directory.
> >
> > Then Google's documentation says:
> >
> > https://support.google.com/webmasters/answer/6062608?hl=en
> > Google> A robotted page can still be indexed if linked to from other
> > sites
> > Google> While Google won't crawl or index the content blocked by robots.txt,
> > we might still find and index a disallowed URL if it is linked from other
> > places on the web
> >
> > Which means "Google would discover the link to the preview from a mailing
> > list archive", so the only feasible option is to remove the page completely
> > (or move it to a new place that is never mentioned in the mailing lists).
> >
> > Vladimir

-- 
Cordialement.
Philippe Mouawad.
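For what it's worth, keeping both the robots.txt and the redirect could look something like the sketch below. This is only an illustration, not a tested config: it assumes mod_rewrite is enabled for .htaccess files on home.apache.org, and the /~milamber/ prefix and target URL are taken from the thread. The first rule serves robots.txt locally so crawlers still see the disallow; the second redirects everything else, including a bare /~milamber/jmeter with no trailing slash.

```apache
# Hypothetical .htaccess in /~milamber/ -- a sketch, assuming mod_rewrite
# is available; paths and target URL are taken from this thread.
RewriteEngine On

# Keep serving the local robots.txt (with its Disallow rule) instead of
# redirecting it away.
RewriteRule ^robots\.txt$ - [L]

# Redirect every other request under this directory, whether or not it
# ends with a trailing slash, to the live site.
RewriteRule ^ https://jmeter.apache.org/ [R=302,L]
```

A 302 (temporary) redirect is used here on purpose: the preview directories come and go with each release candidate, so a cached 301 could outlive them.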
