Re: [Dspace-tech] Google crawler repeatedly requests non-existent handles...

2010-09-23 Thread Vinit
Dear Panyarak, Check your sitemap files. Find the non-existent pages and delete those URLs. Also, in the robots.txt file you have to give the location of the sitemap file. Regards Vinit Kumar Senior Research Fellow Documentation Research and Training Centre Bangalore MLISc (BHU) Varanasi,
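
For reference, a minimal robots.txt along the lines Vinit suggests could look like the sketch below. The hostname is a placeholder; DSpace installations typically serve a sitemap index at /sitemap, but the exact URL depends on your deployment:

    User-agent: *
    Sitemap: http://your.repository.example/sitemap

Rather than hand-editing the generated files, regenerating them is usually simpler; assuming the standard DSpace launcher (command names vary by version), something like [dspace]/bin/dspace generate-sitemaps rebuilds them from the current repository contents.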

Re: [Dspace-tech] Google crawler repeatedly requests non-existent handles...

2010-09-23 Thread Panyarak Ngamsritragul
Thanks Vinit for the information. I checked the sitemap files under DSPACE/sitemaps and found that the handles the Google crawler keeps accessing do not exist in any of the files there. Or are there sitemap files elsewhere? How do I include the sitemap files in robots.txt? Sorry if this is a
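
One quick way to confirm a handle really is absent from all the generated sitemaps is a shell search; this is a sketch that assumes the files are gzip-compressed (sitemap*.xml.gz, as DSpace usually writes them) and uses a made-up handle:

    cd [dspace]/sitemaps
    gunzip -c sitemap*.xml.gz | grep "123456789/9999"

If grep prints nothing, the handle is not being advertised via the sitemaps, and the repeated requests are more likely coming from stale links Google indexed elsewhere.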

[Dspace-tech] Google crawler repeatedly requests non-existent handles...

2010-09-20 Thread Panyarak Ngamsritragul
Hi, There are 2 points here: 1. In our repository, we have configured to allow crawlers to browse our site by putting up a robots.txt with only one line: User-agent: * I have checked with Webmaster Tools and it reports that the crawler access was successful. Anyway, I am not quite sure that should
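
As a side note, the original robots exclusion convention expects each record to contain at least one Disallow line, so the more conventional spelling of an allow-everything robots.txt is:

    User-agent: *
    Disallow:

An empty Disallow value means nothing is disallowed; most crawlers, Google included, treat the one-line version the same way, but the explicit form avoids ambiguity.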