On 2010/12/04 18:56 (GMT+0530) Chetan Crasta composed:

After reading your explanation I still don't think the huge amount of
non-semantic code is justified. Sure, your site might work perfectly
in Internet Explorer 3 running on Windows 95 with a Pentium 200 MHz
and a 14.4 kbps modem, but does anybody care? Why burden search engine
bots and normal users with cruft that shouldn't have made it past the 90s?

So, you're saying every page should be rewritten every time technology changes? How often, and by what yardstick or landmark change, does one decide? Who's going to pay for the time required? And why do browsers maintain backward compatibility?

"Normal" users don't use most of Georg's such pages.

Bots are bots; they lie in beds of their own making.
--
"The wise are known for their understanding, and pleasant
words are persuasive." Proverbs 16:21 (New Living Translation)

 Team OS/2 ** Reg. Linux User #211409

Felix Miata  ***  http://fm.no-ip.com/
