Good thought, Vincent.

However, the W3C checker requires access to the web server:

   [Enter the address (URL) of a document that you would like to check:]

Charles

--  Charles Hart Enzer, M.D., FAACAP --
Volunteer Associate Professor of Psychiatry
University of Cincinnati Medical Center
Website: http://TinyURL.com/EnzerMD
________________________________________
From: [email protected] [[email protected]] On Behalf Of 
Vincent Teachout [[email protected]]
Sent: Tuesday, April 13, 2010 8:08 AM
To: [email protected]
Cc: [email protected]
Subject: Re: [NF] Error: 403 Forbidden by robots.txt

Enzer, Charles (enzerch) wrote:
> The problem is the Server at the University:
>
>    "The homepages.uc.edu webserver has a robots.txt file which denies access 
> to all search engines. The W3C Checklink utility is honoring the 
> robots.txt. There is no workaround for this. You will need to use another 
> utility."
>


Could you place copies on your local machine, and run the utility
against the local copies?
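If the W3C checklink tool itself can't be pointed at local files conveniently, the same idea can be sketched directly: copy the pages down (e.g. with wget), then scan the local copies for relative links that don't resolve. The sketch below is a minimal, hypothetical stand-in for a real link checker, assuming the pages live in one local directory; it only checks relative hrefs and does nothing like the full W3C utility.

```python
# Minimal local link check: a sketch of the "run it against local copies"
# idea. Assumes pages were already copied into a local directory; checks
# only relative hrefs against the local filesystem.
import os
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Collect href attributes from <a> and <link> tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag in ("a", "link"):
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def broken_local_links(html_path):
    """Return relative links in html_path that do not resolve to a local file."""
    parser = HrefCollector()
    with open(html_path, encoding="utf-8", errors="replace") as f:
        parser.feed(f.read())
    base = os.path.dirname(os.path.abspath(html_path))
    broken = []
    for href in parser.hrefs:
        # Skip absolute URLs, mail links, and in-page fragments.
        if "://" in href or href.startswith(("mailto:", "#")):
            continue
        # Strip any fragment/query before resolving against the local dir.
        target = os.path.join(base, href.split("#")[0].split("?")[0])
        if not os.path.exists(target):
            broken.append(href)
    return broken
```

For example, running `broken_local_links("copies/index.html")` over a mirrored page would list the relative links whose targets are missing from the local copy, without any robots.txt ever being consulted.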
_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/profox
OT-free version of this list: http://leafe.com/mailman/listinfo/profoxtech
Searchable Archive: http://leafe.com/archives/search/profox
This message: 
http://leafe.com/archives/byMID/profox/[email protected]
** All postings, unless explicitly stated otherwise, are the opinions of the 
author, and do not constitute legal or medical advice. This statement is added 
to the messages for those lawyers who are too stupid to see the obvious.