On the topic of speed: the current limitation is not imposed by running inside a browser. Some very popular and successful apps are web based, including Visual Studio Code. It's more of an architectural and design problem than a platform issue. I'm not very familiar with the code yet, but I suspect the following could be factors:
- a lack of caching on both the client and the server
- an excessive number of DOM nodes (currently around 4000 at idle)
- excessive nesting using tables
- lack of a view reconciliation library such as React/Angular
- usage of AMD modules, making dead-code elimination (tree-shaking) impossible; if asynchronous loading was the intention, webpack supports dynamic imports with ES Modules while retaining the benefits of tree-shaking
- lack of skeleton UIs, which are perceived as faster; most top sites do this now, making them look faster even if the time to interactive is the same
- waiting on AJAX requests before rendering UIs (theory: the query tool takes 2-3s to open, is this why?)

There's more to discover; I'm still reading through.

Modern UI design philosophy is to react immediately to user input by showing something as close to the real interface as possible, then loading the data asynchronously behind a loading indicator. This is done in some situations, but often not convincingly or not immediately enough (again, the query tool: anywhere between 2-4s until first paint). Anywhere over one second and frustration sets in; 100ms should be the target to keep power users happy.

More info:
Performance: https://developers.google.com/web/fundamentals/performance/rail
DOM size: https://developers.google.com/web/tools/lighthouse/audits/dom-size

The best way to identify areas that require improvement is to run the Lighthouse <https://chrome.google.com/webstore/detail/lighthouse/blipmdconlkpinefehnmjammfjpmpbjk> auditing tool.

On Tue, Jul 30, 2019 at 5:46 PM richard coleman <rcoleman.ascen...@gmail.com> wrote:

> Dave,
>
> That's true. Of course that's the ultimate drawback of the client/server
> model. Having *just* a server isn't enough, you need a *client* as well.
> pgAdmin4 is *just* the server portion of that model. It's relying on
> *borrowing* someone else's client. That frees up quite a bit of
> resources, not having to write an actual client.
> Unfortunately that means you have to deal with all of the limitations
> (speed, footprint, compatibility, security, capabilities) that that
> entails. Contrary to the thought in some circles, not *everything* is
> amenable to being served through a web browser.
>
> Just one man's opinion.
>
> On Tue, Jul 30, 2019 at 4:10 AM Dave Page <dp...@pgadmin.org> wrote:
>
>> On Mon, Jul 29, 2019 at 7:31 PM Mark Murawski
>> <markm-li...@intellasoft.net> wrote:
>>
>>> Would there be a possibility of embedding Chromium? Since of course
>>> it's actively developed and everyone including their pet cat is using
>>> it as a rendering engine these days (including Microsoft). Not sure
>>> how the compatibility with the BSD license would go...
>>
>> It's technically possible of course (on most, but not all, of our
>> supported platforms), but would be a massive amount of work, probably
>> tying up most of my team for months whilst they figure out how to glue
>> all the pieces together into Qt on Windows, Mac and Linux. I'd far
>> rather they were building actual features.
>>
>> --
>> Dave Page
>> Blog: http://pgsnake.blogspot.com
>> Twitter: @pgsnake
>>
>> EnterpriseDB UK: http://www.enterprisedb.com
>> The Enterprise PostgreSQL Company
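To make the caching point from the list above concrete, here is a minimal sketch of client-side caching of AJAX responses. Everything here is hypothetical illustration, not existing pgAdmin code: `makeCachedFetcher` wraps any async fetch function and memoises results per key with a time-to-live, so repeated requests for the same data (e.g. re-expanding a tree node) skip the network round-trip entirely.

```javascript
// Hypothetical helper: wrap an async fetch function with a TTL cache.
function makeCachedFetcher(fetchFn, ttlMs) {
  const cache = new Map(); // key -> { value, expires }
  return async function cachedFetch(key) {
    const hit = cache.get(key);
    if (hit && hit.expires > Date.now()) {
      return hit.value; // cache hit: no network round-trip
    }
    const value = await fetchFn(key);
    cache.set(key, { value, expires: Date.now() + ttlMs });
    return value;
  };
}
```

The same idea applies server-side (cache catalog query results keyed on the query and server version), and the TTL keeps stale data bounded without needing explicit invalidation everywhere.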
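And a minimal sketch of the skeleton-first pattern described above (again, hypothetical names, not how pgAdmin currently does it): the placeholder markup is painted synchronously on the same tick as the user's click, and the real content swaps in whenever the AJAX call resolves.

```javascript
// Hypothetical helper: paint a skeleton immediately, fill in data later.
function openPanel(container, loadData) {
  // First paint happens synchronously, before any network activity.
  container.innerHTML = '<div class="skeleton">Loading…</div>';
  // Real content replaces the skeleton whenever the data arrives.
  return loadData().then((rows) => {
    container.innerHTML =
      `<ul>${rows.map((r) => `<li>${r}</li>`).join('')}</ul>`;
  });
}
```

Even if the total time to usable data is unchanged, the immediate paint is what keeps perceived latency near the 100ms target: the user sees the panel react instantly instead of staring at a frozen UI for 2-4 seconds.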