Michael Cordover's comments were the correct answer. :)
Here is an excerpt from an Interview with Matt Cutts, Google engineer, just last month:
Q: "In more general terms, what do you think is the relationship between Google and the W3C? Do you think it would be important for Google to e.g. be concerned about valid HTML?"
A: I like the W3C a lot; if they didn't exist, someone would have to invent them. :) People sometimes ask whether Google should boost (or penalize) for valid (or invalid) HTML. There are plenty of clean, perfectly validating sites, but also lots of good information on sloppy, hand-coded pages that don't validate. Google's home page doesn't validate and that's mostly by design to save precious bytes. Will the world end because Google doesn't put quotes around color attributes? No, and it makes the page load faster. :) Eric Brewer wrote a page while at Inktomi that claimed 40% of HTML pages had syntax errors. We can't throw out 40% of the web on the principle that sites should validate; we have to take the web as it is and try to make it useful to searchers, so Google's index parsing is pretty forgiving."
http://blog.outer-court.com/archive/2005-11-17-n52.html
I suppose the real issue now is: can someone build the Google page so that it works in all browsers, so that it validates, and so that the resulting code is 'lighter' and saves even more bandwidth? After all, Google are saying there is a commercial benefit to their invalid codebase; the only way they'd consider a change, in my opinion, is for a greater commercial benefit.
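Cutts's byte-saving point is easy to sanity-check. A rough sketch (the markup strings here are hypothetical examples, not Google's actual source) compares the byte cost of quoted versus unquoted attribute values:

```python
# Illustrative only: each dropped pair of attribute quotes saves
# two bytes, which is the trade-off Cutts describes against validity.
unquoted = b'<body bgcolor=#ffffff text=#000000>'
quoted = b'<body bgcolor="#ffffff" text="#000000">'

saved = len(quoted) - len(unquoted)
print(saved)  # 4 bytes saved by unquoting two attributes
```

Per page view the saving is tiny, but multiplied across Google's traffic it adds up, which is presumably the commercial benefit the commenter would need to beat.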
