Picking this issue up once more.

I am seeing strongly different performance between two comparable systems (in 
different facilities), which does not seem plausible to me:

Visualization Environment 1:

VGL Server:
Hardware:
  CPU: 2x Intel Xeon E5-2687W v4, 12x 3.0 GHz (3.5 GHz max)
  MB : Supermicro X10DAi
  RAM: 256 GB DDR4 ECC 2400 MHz (CT32G4RFD424A)
  GPU: NVIDIA Quadro M6000 12 GB
  SSD: 2 TB Samsung PM863
  HDD: 24 TB RAID 6 (5x HUH728080ALE600)
  LAN: 10 Gbit Supermicro AOC-STG-I2T

Software:
Ubuntu 12.04 LTS
VirtualGL 2.6.1

VGLClient (desktop PC):
  CPU: Intel(R) Core(TM) i7-2600 CPU @ 3.40GHz
  LAN: 1 Gbit
  Software: Ubuntu 16.04 LTS

Benchmark (+pr enables VirtualGL's profiling output):

vglrun +pr /opt/VirtualGL/bin/glxspheres64

Readback    - 1279.52 Mpixels/sec-  936.68 fps
625.544299 frames/sec - 854.508525 Mpixels/sec
Compress 0  -   91.66 Mpixels/sec-   67.10 fps
Total       -  112.60 Mpixels/sec-   82.43 fps-  140.59 Mbits/sec (19.2:1)
Readback    - 1276.81 Mpixels/sec-  934.69 fps
620.622386 frames/sec - 847.785074 Mpixels/sec
Compress 0  -   84.02 Mpixels/sec-   61.51 fps
Total       -  107.23 Mpixels/sec-   78.50 fps-  134.42 Mbits/sec (19.1:1)

CPU load:
Server: 170% average
Client: 270% average


Visualization Environment 2:

VGL Server:
Hardware:
  CPU: 2x Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz (3.7 GHz max)
  RAM: 256 GB DDR4 ECC 2666 MHz
  GPU: NVIDIA Quadro P5000 16 GB [GP104GL]
  LAN: 1 Gbit

Software:
Ubuntu 18.04 LTS
VirtualGL 2.6.1

VGLClient (laptop):
  CPU: Intel(R) Core(TM) i7-6600 CPU @ 2.60GHz
  LAN: 1 Gbit
  Software: Windows 10 with Ubuntu 18.04 under WSL, VcXsrv X server

vglrun +pr /opt/VirtualGL/bin/glxspheres64

Readback    -  336.98 Mpixels/sec-  301.95 fps
44.329628 frames/sec - 49.471864 Mpixels/sec
Compress 0  -  122.81 Mpixels/sec-  110.04 fps
Total       -   32.88 Mpixels/sec-   29.46 fps-   55.18 Mbits/sec (14.3:1)
Readback    -  333.44 Mpixels/sec-  298.78 fps
44.390165 frames/sec - 49.539424 Mpixels/sec
Compress 0  -  122.34 Mpixels/sec-  109.62 fps
Total       -   27.66 Mpixels/sec-   24.78 fps-   47.42 Mbits/sec (14.0:1)

CPU load:
Server: 388% average
Client: 77% average
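
One thing I noticed while comparing the numbers: dividing glxspheres' own 
Mpixels/sec by its frames/sec gives the pixels per frame, and the two runs do 
not match. A quick awk sketch, with the figures copied from the output above:

awk 'BEGIN { print 854.508525 / 625.544299 }'   # env 1: ~1.366 Mpixels/frame
awk 'BEGIN { print 49.471864 / 44.329628 }'     # env 2: ~1.116 Mpixels/frame

1.116 Mpixels is exactly 1240x900, the default glxspheres window size, so the 
window in environment 1 was apparently larger. That makes the gap even 
stranger: environment 2 is slower although it pushes fewer pixels per frame.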

I don't understand why the CPU load on the VGL server in environment 2 is so 
high compared to environment 1, even though its total frame rate is much lower.
No firewall is active in either environment that could add overhead.
Do you have any hints on where to look?
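
In case it helps narrow things down, these are the follow-up checks I could 
run on both systems (standard Linux tools; <vgl-server-host> is a placeholder):

# Raw network throughput between client and server:
iperf3 -s                      # on the VGL server
iperf3 -c <vgl-server-host>    # on the client

# Per-process CPU usage on the server while glxspheres64 runs:
pidstat -u 1 10

# The same benchmark over the X11 transport instead of the VGL transport,
# to see whether the image compression path is what eats the CPU:
vglrun -c proxy +pr /opt/VirtualGL/bin/glxspheres64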

Best regards

On Thursday, February 21, 2019 at 12:45:28 PM UTC+1, ulsch wrote:
>
> Think simple instead of complicated :-)
>
> Thank you very much, the port was blocked!