Hi,

I'm writing Gambas 2.8 applications (clients) which run on a 
terminal server cluster (NX). The problem I have is that each Gambas 2.8 
program's consumption of system resources increases dramatically after 
connecting to a server over TCP. I tested whether this behavior also 
occurs when a very simple client application connects to an equally 
simple server application on localhost, and found the same problem 
there.
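
For reference, the minimal localhost test I mean is roughly the 
following. This is only a sketch using the gb.net Socket and 
ServerSocket classes; the port number (5000) and class layout are 
arbitrary:

```
' Client side (sketch): open one TCP connection to localhost
Private hSocket As Socket

Public Sub ConnectToServer()
  hSocket = New Socket As "Socket"
  hSocket.Connect("localhost", 5000)
End

Public Sub Socket_Ready()
  ' Connection established
  Write #hSocket, "hello", 5
End

Public Sub Socket_Error()
  Print "Connection failed"
End
```

```
' Server side (sketch): listen and accept incoming connections
Private hServer As ServerSocket

Public Sub StartListening()
  hServer = New ServerSocket As "Server"
  hServer.Type = Net.Internet
  hServer.Port = 5000
  hServer.Listen()
End

Public Sub Server_Connection(RemoteHostIP As String)
  Dim hClient As Socket
  hClient = hServer.Accept()
  Print "Connection from "; RemoteHostIP
End
```

Even with a trivial pair like this, resource usage per client process 
goes up sharply once the socket is connected.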

Without client sockets I can let up to 40 people work on one 
terminal server. With client sockets, only 15 to 20 people can work on 
the same machine.

Does anyone have an idea?

Thanks

Lars

_______________________________________________
Gambas-user mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/gambas-user
