Link rot *is* a problem, one which, season by season, affects more and more web users, authors and admins. Also, they say, search engines do not like broken links, and this might hurt a site's luck in Google (& Co.) rankings. Although there seems to be no simple solution to the issue, we can (please read: "most probably should") identify some strategies to improve people's (and our own) experience on the internet. Besides a set of behaviours and best practices for minimizing link rot, a key role can surely be played by robots that spider our sites and report problems, flagging the occasional broken link as it appears.
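For the sake of concreteness, here is a minimal sketch (in Python) of what such a robot boils down to; the page URL is just one of my own pages as an example, and the regex-based link extraction and HEAD-only probing are deliberately naive:

    # Minimal link-rot checker sketch: fetch one page, probe every
    # absolute link found on it, and report the ones that look dead.
    import re
    import urllib.request
    import urllib.error

    # Example target page (one of mine); substitute your own.
    PAGE = "http://www.kirpi.it/wiki/pmwiki.php?n=Musica.Musica"

    def check(url, timeout=10):
        """Return the HTTP status code, or None if the request failed."""
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                return resp.status
        except urllib.error.HTTPError as e:
            return e.code
        except Exception:
            return None  # DNS failure, timeout, refused connection, ...

    html = urllib.request.urlopen(PAGE, timeout=10).read().decode("utf-8", "replace")
    # Naive extraction: absolute http(s) links only, straight from the markup.
    links = set(re.findall(r'href="(https?://[^"]+)"', html))
    for url in sorted(links):
        status = check(url)
        if status is None or status >= 400:
            print(f"BROKEN ({status}): {url}")

(Some servers answer HEAD badly, so a real checker would fall back to GET; this is only meant to show the shape of the thing.)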
I installed a script I found, which apparently does the job[1]. You can try it at http://www.kirpi.it/no404s/index.php It runs rather slowly (maybe that's my fault, as I know nothing about setting up robots or optimizing server load), but with a moderately light page such as http://www.kirpi.it/wiki/pmwiki.php?n=Musica.Musica you get a report in some 20 seconds.

Does anybody here have any experience with link rot scripts? Any idea of how to integrate them (if possible) with pmwiki? Since I pay a flat rate, why shouldn't my server scan for dead links every now and then, while I sleep at night? And why shouldn't it, in case of a dead link, check the Wayback Machine and possibly suggest that I automagically swap/alter the link?

Luigi

----
[1] http://scripts.ishallnotcare.org/no404s/
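P.S. About the Wayback Machine idea: archive.org has a JSON "availability" lookup, so the "suggest a replacement" step could look roughly like this (a Python sketch; the function name and example URL are mine, nothing here is wired into pmwiki yet):

    import json
    import urllib.parse
    import urllib.request

    def wayback_snapshot(dead_url, timeout=10):
        """Ask archive.org for the closest archived copy of a URL.
        Returns the snapshot URL, or None if nothing is archived."""
        api = ("https://archive.org/wayback/available?url="
               + urllib.parse.quote(dead_url, safe=""))
        with urllib.request.urlopen(api, timeout=timeout) as resp:
            data = json.load(resp)
        closest = data.get("archived_snapshots", {}).get("closest")
        if closest and closest.get("available"):
            return closest["url"]
        return None

    # Hypothetical dead link, just to show the call:
    print(wayback_snapshot("http://example.com/gone.html"))

A nightly cron job could run the checker, pipe dead links through something like this, and mail me the suggested swaps.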
