Quite true - in the script I use, I have this for the W3C validator:
// The validator doesn't send an Accept header, so check its user agent
if (stristr($_SERVER["HTTP_USER_AGENT"], "W3C_Validator")) {
   $mime = "application/xhtml+xml";
}

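For anyone wanting the fuller picture, a minimal negotiation sketch might 
look roughly like this (the variable names, the charset, and the text/html 
fallback are illustrative assumptions on my part, not the actual script):

<?php
// Default to text/html for browsers that don't claim XHTML support.
$mime = "text/html";

$accept = isset($_SERVER["HTTP_ACCEPT"]) ? $_SERVER["HTTP_ACCEPT"] : "";
$agent  = isset($_SERVER["HTTP_USER_AGENT"]) ? $_SERVER["HTTP_USER_AGENT"] : "";

// Browsers advertising application/xhtml+xml in their Accept header get it.
if (stristr($accept, "application/xhtml+xml")) {
    $mime = "application/xhtml+xml";
}

// The W3C validator sends no Accept header, so sniff its user agent instead.
if (stristr($agent, "W3C_Validator")) {
    $mime = "application/xhtml+xml";
}

header("Content-Type: $mime; charset=utf-8");
?>
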
As to why, my own personal reasons are three-fold:
1. The W3C is clear that XHTML 1.1 should not (different from must not, 
I'm aware) be sent as text/html - http://www.w3.org/TR/xhtml-media-types/ 
(I think that's the right link - I'm using a PDA and browsing is a pain). 
2. Conforming UAs *should* refuse to render invalidly marked-up pages 
when they're sent with the proper MIME type - that saves me a lot of 
time in development.
3. If you're using MathML or XHTML Ruby at all, your documents have to 
be sent as XML (application/xhtml+xml, application/xml or text/xml), not 
text/html - unless of course you want to stick the content into an 
<object> tag. 
I do believe that's all for now - however, I'm not at work and my mind 
is in a different place, so hopefully this came out coherently enough. :)
Cheers,

> On a related note, since the W3C's validator doesn't send an 
> "HTTP_ACCEPT" header, you should also look at the "HTTP_USER_AGENT" 
> header as well. While I normally would advise against browser sniffing, 
> I make exceptions for the W3C Validator, the W3C CSS Validator, and the 
> WDG Validator.

-- 

******************************************************
The discussion list for  http://webstandardsgroup.org/

 See http://webstandardsgroup.org/mail/guidelines.cfm
 for some hints on posting to the list & getting help
******************************************************
