Arjen van der Meijden wrote:

Here is a graph of our performance measured on PostgreSQL:


The "perfect" line is based on the "Max" value for 1 core, multiplied by the number of cores to give a linear reference. The "Bij 50" line and the "perfect" line are similar in color, but the top one is the "perfect" line.
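The construction described above can be sketched as follows. This is a minimal illustration, not the actual benchmark code; the throughput numbers are made up for the example.

```python
# Hypothetical throughput measurements (pages generated) keyed by core count.
# These values are invented for illustration only.
measured = {1: 120, 2: 230, 4: 430, 8: 760}

# "Perfect" linear reference: the single-core maximum scaled by core count.
single_core_max = measured[1]
perfect = {cores: single_core_max * cores for cores in measured}

for cores in sorted(measured):
    efficiency = measured[cores] / perfect[cores]
    print(f"{cores} cores: measured={measured[cores]}, "
          f"perfect={perfect[cores]}, efficiency={efficiency:.0%}")
```

The gap between the measured curve and this reference shows how far the scaling falls short of linear.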

Surely the 'perfect' line ought to be linear?  If the performance were perfectly linear, then the 'pages generated' ought to be G times the number of (virtual) processors, where G is the gradient of the graph.  In that case the graph would go through the origin (0,0), but your graph does not show this.

I'm a bit confused, what is the 'perfect' supposed to be?
