Joshua Schachter wrote:
> All kinds of subtlety here. For example, what to do if the site
> happens to be down while we check it? What about respecting robots.txt
> etc?  

I don't think there's too much subtlety involved.  Obviously (to me),
robots.txt needs to be respected.  Those links will simply not be
automatically checked for validity.

Sometimes a link dies because the site really _has_ gone away.
I understand that a temporary outage creates a false positive, but I
think this kind of tool would be very useful nonetheless.  
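A rough sketch of how both concerns could be handled: consult robots.txt
before checking a link at all, and require several consecutive failures
before declaring it dead, so a one-off outage doesn't flag it.  (This is
just an illustration, not delicious's actual checker; the function names
and the failure threshold are my own invention.)

```python
from urllib import robotparser

def robots_allows(robots_txt, url, agent="linkchecker"):
    """Decide from an already-fetched robots.txt body whether this
    checker may fetch the given URL.  If robots.txt forbids it, the
    link is simply never auto-checked."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

def update_failure_count(prev_failures, check_succeeded, threshold=3):
    """Track consecutive failures; only report a link dead once it has
    failed `threshold` checks in a row, so a temporary outage on one
    pass doesn't create a false positive.  Returns (failures, is_dead)."""
    failures = 0 if check_succeeded else prev_failures + 1
    return failures, failures >= threshold
```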

You'd probably want a way to administer this (so the flag could be set
or unset manually) - but it could be done through a special tag, too.
Make a rule that tags beginning with "special:" (or whatever) are
reserved for use by the system.  Just flag the dead links with
"special:dead" and you're done.  (In this case the special use could be
addition of class="dead" to the LI tag for this entry.  A CSS rule could
dim it or something.)  Remove this tag to restore the link.  Thus no
extra admin features are needed, just tag management.
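A minimal sketch of what the rendering side might look like under this
scheme, assuming a hypothetical render_entry() helper (the reserved
prefix and helper name are illustrative, not delicious's actual code):

```python
from html import escape

RESERVED_PREFIX = "special:"  # tags with this prefix are system-reserved

def render_entry(url, title, tags):
    """Render one bookmark as an LI, adding class="dead" when the
    system has flagged the link with the reserved special:dead tag.
    Reserved tags are hidden from the visible tag list."""
    css_class = ' class="dead"' if "special:dead" in tags else ""
    visible = [t for t in tags if not t.startswith(RESERVED_PREFIX)]
    return (f'<li{css_class}><a href="{escape(url)}">'
            f'{escape(title)}</a> ({", ".join(visible)})</li>')
```

A CSS rule along the lines of `li.dead { opacity: 0.5; }` would then dim
flagged entries, and deleting the tag restores normal display with no
extra admin UI.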

Tim
-- 
Tim Larson
West Corporation, Interactive TeleServices
Eschew obfuscation!


 
Yahoo! Groups Links

<*> To visit your group on the web, go to:
    http://groups.yahoo.com/group/ydn-delicious/

<*> To unsubscribe from this group, send an email to:
    [EMAIL PROTECTED]

<*> Your use of Yahoo! Groups is subject to:
    http://docs.yahoo.com/info/terms/
 

