On 07.09.2017 23:30, Daniel Veditz wrote:

Without some kind of signal everyone gets the least-common-denominator version of a site

Exactly. That was the idea behind the web. Unfortunately, so many
things have been added in recent years that browsers have become more
complex than operating systems, web coders make lots of dubious
assumptions (thankfully, they at least learned not to dictate screen
resolutions anymore :o), and the code of major sites gets worse
day by day.

Just compare Facebook vs. VK: same functionality - FB is extremely
slow and fat, VK is very slim and fast. Ergo: folks like FB should
just do their homework (or get optimized away someday).

(and even then older equipment or phones may result in a poor experience) or sites will try to guess based on user-agent.

Well, it's been a while since I was actively building web apps
(back when PCs were slower than today's cheap smartphones), but I
can't recall a single case where I wished for such a feature. My
applications also worked well even on the early smartphones (e.g.
the good old Nokia Communicator).

For images and things maybe CSS could specify some media queries that loads different resources based on local factors like amount of memory or network speed, but then that's leaking the same information just in a different way.

Not necessarily, and not at the same granularity. Of course it would be
better if requests for CSS and co. didn't send cookies at all.
(IMHO, cookies were a very bad invention to begin with.)
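For illustration, here is a rough sketch (not a proposal) of how a page can
already pick a resource variant client-side; navigator.deviceMemory and
navigator.connection are Chromium-only APIs that may be undefined elsewhere,
and the element id and file names are made up. The point about granularity:
the server only learns which variant got requested, i.e. one coarse bit,
not the raw memory size or connection speed.

```typescript
// Sketch only: choose an image variant from coarse client-side signals.
// navigator.deviceMemory (Device Memory API) and navigator.connection
// (Network Information API) are Chromium-only; fall back to defaults
// where they are missing. Variant file names are hypothetical.

function pickImageVariant(): string {
  const nav = navigator as Navigator & {
    deviceMemory?: number;
    connection?: { effectiveType?: string };
  };

  const lowMemory = (nav.deviceMemory ?? 8) <= 2; // GiB, rough threshold
  const slowNet = ["slow-2g", "2g"].includes(
    nav.connection?.effectiveType ?? "4g",
  );

  // Whatever branch is taken becomes visible to the server once the
  // resource is fetched, but only as a single low/high choice.
  return lowMemory || slowNet ? "hero-small.jpg" : "hero-large.jpg";
}

document
  .querySelector<HTMLImageElement>("#hero")
  ?.setAttribute("src", pickImageVariant());
```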


--mtx
