The problem is the server at the university: "The homepages.uc.edu webserver has a robots.txt file which denies access to all search engines. The W3 Checklink utility is honoring the robots.txt. There is no workaround for this. You will need to use another utility."
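For what it's worth, robots.txt rules are per-user-agent, so deleting the file isn't the only option: the server admin could allow just the W3C Link Checker (which identifies itself as W3C-checklink) while still blocking search engines. A sketch of what homepages.uc.edu's robots.txt might look like — the existing block-everything rule is an assumption:

```
# Hypothetical robots.txt: let the W3C Link Checker through
# while continuing to exclude search engines.
User-agent: W3C-checklink
Disallow:

# Assumed existing rule: deny all other crawlers.
User-agent: *
Disallow: /
```

An empty Disallow line means "nothing is disallowed" for that user agent, so the checker can crawl while everything else stays blocked.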
--
Charles
Volunteer Associate Professor of Psychiatry
University of Cincinnati Medical Center
Website: http://TinyURL.com/EnzerMD

________________________________________
From: [email protected] [[email protected]] On Behalf Of Grigore Dolghin [[email protected]]
Sent: Tuesday, April 13, 2010 1:08 AM
To: [email protected]
Subject: Re: [NF] Error: 403 Forbidden by robots.txt

How about removing that file for a while and trying again? :)

On Tue, Apr 13, 2010 at 7:36 AM, Enzer, Charles (enzerch) <[email protected]> wrote:
> I want to check the validity of all of my links on my website.
>
> I used:
>
> http://validator.w3.org/checklink
>
> And got this error message:
>
> Error: 403 Forbidden by robots.txt
>
> What do you suggest for an online Link Checker?
>
> Thank you.
>
> Charles
>
> --
> Charles Hart Enzer, M.D., FAACAP
> Volunteer Associate Professor of Psychiatry
> University of Cincinnati Medical Center
> Website: http://TinyURL.com/EnzerMD

[excessive quoting removed by server]
_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/profox
OT-free version of this list: http://leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message: http://leafe.com/archives/byMID/profox/[email protected]
** All postings, unless explicitly stated otherwise, are the opinions of the author, and do not constitute legal or medical advice. This statement is added to the messages for those lawyers who are too stupid to see the obvious.
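If no online checker will do, a small local script sidesteps the problem entirely: plain HTTP requests from the Python standard library never consult robots.txt, so the 403 simply doesn't apply. A minimal sketch (hypothetical, not the W3C tool; the example.com URL and User-Agent string are placeholders):

```python
# Minimal link-checker sketch using only the Python standard library.
# Unlike the W3C Checklink service, this does not honor robots.txt.
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import Request, urlopen


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag in the page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html, base_url):
    """Return absolute URLs for every anchor in the HTML document."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]


def check_link(url, timeout=10):
    """Return the HTTP status code for url, or the error reason on failure."""
    req = Request(url, method="HEAD",
                  headers={"User-Agent": "link-check-sketch"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        return err.code          # e.g. 404 for a broken link
    except URLError as err:
        return str(err.reason)   # DNS failure, refused connection, etc.


if __name__ == "__main__":
    page = "http://example.com/"  # placeholder: put your site here
    with urlopen(page) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        html = resp.read().decode(charset)
    for link in extract_links(html, page):
        print(check_link(link), link)
```

Running it prints one status per link; anything other than 200 (or a redirect code) is worth a look. Real sites would also want retries and a GET fallback for servers that reject HEAD.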

