Hi,

I am experiencing a massive performance drop when running a 32-bit binary through
VirtualGL on my 64-bit Debian installation.
The frames/sec with glxspheres64 are very good, and so far I have had no problems
running any 64-bit binary through VirtualGL.

vglrun +pr /opt/VirtualGL/bin/glxspheres64 
[VGL] NOTICE: Automatically setting VGL_CLIENT environment variable to
[VGL]    192.168.100.18, the IP address of your SSH client.
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: Gallium 0.4 on AMD BARTS
Readback    -   57.53 Mpixels/sec-   51.55 fps
36.439298 frames/sec - 40.666257 Mpixels/sec
Compress 0  -   90.28 Mpixels/sec-   80.90 fps
Total       -   40.71 Mpixels/sec-   36.48 fps-   68.47 Mbits/sec (14.3:1)
Readback    -   68.25 Mpixels/sec-   61.15 fps
42.498992 frames/sec - 47.428875 Mpixels/sec
Compress 0  -   90.19 Mpixels/sec-   80.82 fps
Total       -   47.43 Mpixels/sec-   42.50 fps-   80.53 Mbits/sec (14.1:1)
Readback    -   68.11 Mpixels/sec-   61.03 fps
42.497685 frames/sec - 47.427416 Mpixels/sec
Compress 0  -   90.09 Mpixels/sec-   80.72 fps
Total       -   47.43 Mpixels/sec-   42.50 fps-   81.47 Mbits/sec (14.0:1)
Readback    -   68.09 Mpixels/sec-   61.02 fps
42.498414 frames/sec - 47.428230 Mpixels/sec
Compress 0  -   90.32 Mpixels/sec-   80.93 fps
Total       -   47.42 Mpixels/sec-   42.49 fps-   80.67 Mbits/sec (14.1:1)


Here are the results from running the 32-bit glxspheres:

vglrun +pr /opt/VirtualGL/bin/glxspheres
[VGL] NOTICE: Automatically setting VGL_CLIENT environment variable to
[VGL]    192.168.100.18, the IP address of your SSH client.
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: Gallium 0.4 on AMD BARTS
Readback    -   12.18 Mpixels/sec-   10.91 fps
9.744401 frames/sec - 10.874751 Mpixels/sec
Compress 0  -   64.64 Mpixels/sec-   57.92 fps
Total       -   10.88 Mpixels/sec-    9.75 fps-   18.05 Mbits/sec (14.5:1)
Readback    -   12.18 Mpixels/sec-   10.92 fps
9.658351 frames/sec - 10.778720 Mpixels/sec
Compress 0  -   64.08 Mpixels/sec-   57.42 fps
Total       -   10.79 Mpixels/sec-    9.67 fps-   18.61 Mbits/sec (13.9:1)
Readback    -   12.16 Mpixels/sec-   10.90 fps
9.484710 frames/sec - 10.584937 Mpixels/sec
Compress 0  -   65.05 Mpixels/sec-   58.29 fps
Total       -   10.59 Mpixels/sec-    9.48 fps-   17.28 Mbits/sec (14.7:1)
Readback    -   12.19 Mpixels/sec-   10.93 fps
9.660761 frames/sec - 10.781410 Mpixels/sec
Compress 0  -   63.94 Mpixels/sec-   57.30 fps
Total       -   10.78 Mpixels/sec-    9.66 fps-   18.69 Mbits/sec (13.8:1)


If I run /opt/VirtualGL/bin/glxspheres and /opt/VirtualGL/bin/glxspheres64
stand-alone (without vglrun), I get 85 frames/sec in both cases.
I have no idea what's causing this massive performance drop. Does anyone have an
explanation or any hints for me?
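
In case it helps with diagnosing this, here is a sketch of what could be checked
next (the grep patterns and the use of VGL_VERBOSE below are my assumptions for a
Debian multiarch setup and a VirtualGL 2.x build, not something I have verified):

# Confirm which libGL each binary resolves to outside of vglrun
ldd /opt/VirtualGL/bin/glxspheres   | grep -i libgl
ldd /opt/VirtualGL/bin/glxspheres64 | grep -i libgl

# Re-run the 32-bit test with verbose output to see which interposer library
# gets preloaded for the 32-bit binary (assuming this VirtualGL version
# honours VGL_VERBOSE)
VGL_VERBOSE=1 vglrun +pr /opt/VirtualGL/bin/glxspheres 2>&1 | head -n 20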

Regards,

Ingo
