On 04/06/2023 05:17, Bret Busby wrote:
> On 4/6/23 14:32, Max Nikulin wrote:
>> I believe web site creators should be blamed more aggressively than
>> browser developers for the RAM requirements of contemporary web
>> applications.
> That was the point that I was making - I had not, as a twisted
> response indicated, criticised Firefox regarding the misuse of
> resources. I explicitly referred to malignant web application
> developers (for those who do not understand the term, a web
> application is the application on the web application hosting server
> that the user accesses using a web browser, not the web browser
> itself) that steal users' resources through client-side processing
> (by using malware such as javascript), rather than properly and
> ethically using server-side processing, such as .jsp or Perl .cgi
> applications.
> The problem is that some web developers (and, especially, their
> employers) offload the processing that should be done on the business
> web application hosting server to the victim users' personal
> computers. It is a malignant exploitation, like the "gig economy".
With no client-side javascript, it's not possible to change just a part
of a web page[0]. The server must send the whole web page to be
rendered by the client. So while avoiding client-side scripting
decreases CPU usage in the client, it increases network usage. Isn't it
also unethical to "steal" more bandwidth than necessary?
[0] There are frames (now deprecated) and iframes, but they only get
you so far. And each (i)frame must be a complete HTML page.
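
For concreteness, here is a minimal sketch of the partial update that
client-side scripting allows. The /message endpoint and the
message-pane element are hypothetical, just for illustration: only the
fragment crosses the network, and only that node is re-rendered.

  // Minimal sketch, assuming a hypothetical endpoint /message?id=...
  // that returns only an HTML fragment for a single message, and an
  // existing <div id="message-pane"> element somewhere in the page.
  async function showMessage(id) {
    const response = await fetch('/message?id=' + encodeURIComponent(id));
    const fragment = await response.text();
    // Only the message pane is replaced; the rest of the page, already
    // downloaded and rendered, is left untouched.
    document.getElementById('message-pane').innerHTML = fragment;
  }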
And even with regard to CPU usage your model might not be so great.
Instead of re-rendering just the part of the page that needs to be
changed (say, the message pane in a webmail application), with no
client-side scripting the whole interface must be re-rendered, which
can be resource intensive. So while I'd agree that with client-side
scripting resource usage in the client is higher, it might not be as
much higher as you think.
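
And, for contrast, a rough sketch of the no-scripting alternative,
assuming a hypothetical Node.js webmail server: every click is a full
navigation, so the server regenerates and resends the entire interface,
and the browser has to re-parse and re-render all of it.

  // Purely illustrative: a hypothetical webmail server that, lacking
  // client-side scripting, answers every request with the complete page.
  const http = require('http');

  // Hypothetical helper: wraps one message body in the full interface
  // (folder list, toolbar, message list, message pane).
  function renderFullPage(messageBody) {
    return '<html><body>' +
           '<nav>folder list, toolbar, message list...</nav>' +
           '<div id="message-pane">' + messageBody + '</div>' +
           '</body></html>';
  }

  http.createServer(function (req, res) {
    // The whole document is regenerated and resent for every click.
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(renderFullPage('Requested: ' + req.url));
  }).listen(8080);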
--
Eduardo M KALINOWSKI
edua...@kalinowski.com.br