Hi everybody!
I am experiencing some delay issues using VirtualGL because of the bandwidth
on my network. For example, when I press a button to start rotating a model
as fast as the graphics card can go, and the network is running at maximum
capacity (it is the bottleneck), pressing the stop button produces a delay
that can range from milliseconds to several seconds. I tried the frame
spoiling option ("+sp"), but as far as I can tell it only drops frames when
the client cannot process them; when the bottleneck is the network
bandwidth, VirtualGL just behaves as usual.
I would like to know whether I am doing something wrong or whether spoiling
for this case is simply not implemented yet. If it is not, I would like to
implement it myself, but I need some help with the code. I have been tracing
the -fps and +sp options through the source, trying to see where the frame
spoiling is done, but so far I have not had much luck. So, if anyone knows
the code better than I do (I am sure everyone does :) ), I would be very
thankful if you could tell me where to look.
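
In case it helps to clarify what I have in mind, here is a minimal sketch
of the kind of sender-side spoiling I am after. This is just my own
illustration, not VirtualGL's actual code, and the Frame type and all the
names are made up: the render thread posts frames into a depth-1 "mailbox"
that overwrites any frame the network thread has not picked up yet, so a
saturated link only ever delays the newest frame instead of building up a
backlog of stale ones:

    #include <condition_variable>
    #include <cstdio>
    #include <mutex>
    #include <optional>
    #include <thread>
    #include <utility>

    // Hypothetical frame type -- stand-in for the real frame data.
    struct Frame { int id; /* pixels, timestamp, ... */ };

    // Depth-1 mailbox: a new frame replaces (spoils) any unsent one.
    class SpoilingMailbox {
        std::mutex m;
        std::condition_variable cv;
        std::optional<Frame> slot;   // at most one pending frame
    public:
        // Render thread: post a frame, dropping any unsent one.
        void post(Frame f) {
            {
                std::lock_guard<std::mutex> lk(m);
                slot = std::move(f);   // old frame, if any, is spoiled here
            }
            cv.notify_one();
        }
        // Network thread: wait until a frame is available, then take it.
        Frame take() {
            std::unique_lock<std::mutex> lk(m);
            cv.wait(lk, [this] { return slot.has_value(); });
            Frame f = std::move(*slot);
            slot.reset();
            return f;
        }
    };

    int main() {
        SpoilingMailbox box;
        std::thread sender([&] {
            for (;;) {
                Frame f = box.take();          // always the newest frame
                if (f.id < 0) break;           // sentinel: renderer is done
                std::printf("sent frame %d\n", f.id);
            }
        });
        for (int i = 0; i < 100; ++i)
            box.post(Frame{i});                // render as fast as possible
        box.post(Frame{-1});                   // signal end of stream
        sender.join();
    }

If the sender thread is slow (saturated network), most of the 100 frames
are silently replaced and only the latest ones are ever sent, which is the
behavior I would like to get when the bandwidth is the bottleneck.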
Thank you very much!