I just watched Gerrit Grunwald's excellent presentation "Use the force,
Luke" on Parleys, and in it he mentions that one of the key ways to improve
performance is to limit the number of nodes in the scene graph.  He also
mentions that on devices such as the Raspberry Pi, the maximum number of
nodes viable before performance degrades significantly is very, very
limited.  He then goes on to demonstrate that an equivalent visual
appearance can be achieved by other means, such as CSS or Canvas, using
far fewer nodes.
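To make the trade-off concrete, here is a minimal sketch of the kind of substitution described in the talk: the same grid of small squares built once as thousands of Rectangle nodes, and once drawn into a single Canvas node. The class name `NodeCountDemo`, the node count, and the layout are all illustrative assumptions, not taken from the presentation.

```java
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.canvas.Canvas;
import javafx.scene.canvas.GraphicsContext;
import javafx.scene.layout.Pane;
import javafx.scene.paint.Color;
import javafx.scene.shape.Rectangle;
import javafx.stage.Stage;

/** Illustrative comparison: many Nodes vs. one Canvas (numbers are arbitrary). */
public class NodeCountDemo extends Application {

    // Variant A: 10,000 Rectangle nodes -- 10,000 entries in the scene graph,
    // each carrying its own properties, bounds, and dirty-region tracking.
    private Pane buildWithNodes() {
        Pane pane = new Pane();
        for (int i = 0; i < 10_000; i++) {
            Rectangle r = new Rectangle((i % 100) * 4, (i / 100) * 4, 3, 3);
            r.setFill(Color.STEELBLUE);
            pane.getChildren().add(r);
        }
        return pane;
    }

    // Variant B: the same picture rendered into one Canvas -- a single node
    // in the scene graph, with the squares existing only as pixels.
    private Canvas buildWithCanvas() {
        Canvas canvas = new Canvas(400, 400);
        GraphicsContext gc = canvas.getGraphicsContext2D();
        gc.setFill(Color.STEELBLUE);
        for (int i = 0; i < 10_000; i++) {
            gc.fillRect((i % 100) * 4, (i / 100) * 4, 3, 3);
        }
        return canvas;
    }

    @Override
    public void start(Stage stage) {
        // Swap in buildWithNodes() to compare the two variants side by side.
        stage.setScene(new Scene(new Pane(buildWithCanvas())));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}
```

Both variants produce the same pixels; the difference is how much scene-graph bookkeeping the runtime must do per frame.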

The implication here is that Nodes themselves have a performance-limiting
effect.  If the device's GPU is capable of rendering certain graphics
primitives, effects, transitions etc., and JavaFX is capable of "making
them happen" one way or another, I am curious as to why the mere presence
of Nodes limits performance so significantly.

The obvious conclusion is that Nodes use memory and perhaps the associated
overhead is the cause, but given that we are largely talking about
GPU-based processing, I find it hard to believe it's as simple as that.

So what is it about the nature of Nodes that causes them to have such a
limiting effect on performance?
