rainer wrote:
> > > simple
> >
> It is, should be, at least logically, simple to invoke links
> automatically and mark them depending on return-code.
Challenge accepted.

1. Log into your trac site.
2. Export your cookies for the trac site to e.g. cookies.txt.
3. wget -o log -r --load-cookies cookies.txt -R 'Wiki*,Trac*' -I yoursite/wiki http://your.domain/yoursite/
4. grep -R 'class="[ a-z]*missing[ a-z]*wiki' .

My result:

./yourdomain/yoursite/index.html:<a class="missing wiki" href="/yoursite/wiki/test" rel="nofollow">test?</a>

Note that you might end up with multiple missing links on the same line. You could always use perl instead of grep (see the sketch below the command breakdown).

About the commands above, in case further explanation is needed:

wget
  -o log                        # write wget output to the file "log"
  -r                            # recurse through the trac site (download all links)
  --load-cookies cookies.txt    # load authentication cookies so Trac adds the "missing" class
  -R 'Wiki*,Trac*'              # reject files starting with Wiki and Trac to cut down on the noise
  -I yoursite/wiki              # only get files from the yoursite/wiki directory (no reports, tickets, etc.)
  http://your.domain/yoursite/  # that's your URL

grep
  -R                                   # recurse into directories
  'class="[ a-z]*missing[ a-z]*wiki'   # the text to find ('class="' followed by spaces or letters,
                                       # then "missing", then spaces or letters, then "wiki")
  .                                    # the directory to start the search in (the current directory)
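If you do want each missing link on its own line, a perl one-liner is one way to go. This is only a rough sketch: it assumes the downloaded pages look like the sample output above (class attribute before href), and the regex is deliberately looser than the grep pattern:

  find . -type f -print0 | xargs -0 perl -ne \
      'print "$ARGV: $1\n" while /class="[^"]*missing[^"]*wiki[^"]*"[^>]*href="([^"]+)"/g'

The while-loop with /g prints every match on a line, so several missing links in one file row are no longer a problem. Piping the output through sort -u gives you each missing page path only once, however often it is linked.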

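If you want to repeat the check regularly (e.g. from cron), steps 3 and 4 fit in a small shell script. Again just a sketch, assuming the same layout as above; BASE and PROJECT are placeholder variables I made up, and cookies.txt is the file from step 2:

  #!/bin/sh
  BASE="http://your.domain"
  PROJECT="yoursite"

  # Mirror only the wiki pages, authenticated via the exported cookies.
  wget -o log -r --load-cookies cookies.txt \
       -R 'Wiki*,Trac*' -I "$PROJECT/wiki" "$BASE/$PROJECT/"

  # Report every line that contains a "missing wiki" link.
  grep -R 'class="[ a-z]*missing[ a-z]*wiki' .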