It's not quite the same thing, but I worked on a project a couple of years ago 
integrating references/citations into a learning environment (called Telstar, 
http://www8.open.ac.uk/telstar/), and looked at the question of how to deal 
with broken links from references.

We proposed a more reactive mechanism than running link checking software. This 
clearly has some disadvantages, but I think a major advantage is the targeting 
of staff time towards those links that are actually being used. The mechanism 
proposed was to add a level of redirection, with an intermediary script 
checking the availability of the destination URL before either:

a) passing the user on to the destination, or
b) on finding the destination URL unresponsive (e.g. a 404), automatically 
reporting the issue to library staff and directing the user to a page 
explaining that the resource was not currently responding and that library 
staff had been informed
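The check-then-redirect step can be sketched roughly as below. This is a minimal illustration, not the Telstar/SFX code: the function names, the HEAD-request check, and the stand-in reporting hook are all assumptions.

```python
# Sketch of an intermediary redirect script: check the destination URL,
# then either pass the user on (a) or report the breakage and show a
# holding page (b). Names here are illustrative, not from Telstar.
from urllib.request import Request, urlopen


def is_responding(url, timeout=5):
    """Return True if the destination answers with a 2xx/3xx status."""
    try:
        req = Request(url, method="HEAD")
        with urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except OSError:  # covers URLError, HTTPError (e.g. 404), timeouts
        return False


def resolve(url, check=is_responding, report=print):
    """Decide between case (a) and case (b) for a destination URL."""
    if check(url):
        return ("redirect", url)   # (a) pass the user on
    report("Broken link reported to library staff: %s" % url)
    return ("explain", url)        # (b) show the "staff informed" page
```

Because the availability check is injected as a callable, the decision logic can be exercised (and unit-tested) without touching the network.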

In particular, we proposed putting the destination URL into the rft_id of an 
OpenURL to achieve this, but only because it allowed us to piggyback on 
existing infrastructure using a standard approach - you could do the same with 
a simple script that takes the destination URL as a parameter (if you are 
really interested, we created a new Source parser in SFX to do (a) and (b)). 
Because we didn't necessarily have control over the URL in the reference, we 
also built a table that allowed us to map broken URLs being used in the 
learning environment to alternative URLs so we could offer a temporary redirect 
while we worked with the relevant staff to get corrections made to the 
reference link.
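The mapping table can be as simple as a lookup from the broken URL to a working alternative, consulted before the availability check. A sketch of the idea follows; the example URLs are invented and the real table was maintained by staff, not hard-coded.

```python
# Sketch of the temporary broken-URL -> alternative-URL mapping used to
# offer a redirect while the reference itself is corrected.
# (Example URLs are invented for illustration.)
REDIRECT_MAP = {
    "http://old.example.org/report.pdf": "http://new.example.org/report.pdf",
}


def destination(url):
    """Return the mapped alternative if one exists, else the original URL."""
    return REDIRECT_MAP.get(url, url)
```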

There's some more on this at 
http://www.open.ac.uk/blogs/telstar/remit-toc/remit-the-open-university-approach/remit-providing-links-to-resources-from-references/6-8-3-telstar-approach/
 although for some reason (my fault) this doesn't include a write-up of the 
link checking process/code we created.

Of course, this approach is in no way incompatible with regular proactive link 
checking.

Owen

Owen Stephens
Owen Stephens Consulting
Web: http://www.ostephens.com
Email: [email protected]
Telephone: 0121 288 6936

On 23 Feb 2012, at 17:02, Tod Olson wrote:

> There's been some recent discussion at our site about revi(s|v)ing URL 
> checking in our catalog, and I was wondering if other sites have any 
> strategies that they have found to be effective.
> 
> We used to run some home-grown link checking software. It fit nicely into a 
> shell pipeline, so it was easy to filter out sites that didn't want to be 
> link checked. But still the reports had too many spurious errors. And with 
> over a million links in the catalog, there are some issues of scale, both for 
> checking the links and consuming any report.
> 
> Anyhow, if you have some system you use as part of catalog link maintenance, 
> or if there's some link checking software that you've had good experiences 
> with, or if there's some related experience you'd like to share, I'd like to 
> hear about it.
> 
> Thanks,
> 
> -Tod
> 
> 
> Tod Olson <[email protected]>
> Systems Librarian     
> University of Chicago Library
