Dear Support,

That was a very interesting post (and follow-up on sage-devel).

We seem to be seeing similar behavior (too many processes and/or
all swap memory in use, even with relatively few users) on our (non-
public) server, which is luckily set up in a VMware environment on a
Linux machine, and I'm wondering whether some sort of limit on the
number of processes per user could help us.
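For what it's worth, here is a minimal sketch of how a per-user process cap can be set on a Linux host, using the shell's `ulimit` built-in for the current session and `/etc/security/limits.conf` for a persistent cap. The account name `sageserver` and the limits 200/300 are placeholders, not anything from this thread:

```shell
# Check the current per-user process limit (soft limit) for this shell.
ulimit -u

# Lower the soft limit for this shell session, e.g. to 200 processes.
# (Unprivileged users may lower a limit but not raise it past the hard limit.)
ulimit -S -u 200
ulimit -u   # now reports 200

# For a persistent cap, lines like these can go in /etc/security/limits.conf
# ("sageserver" is a hypothetical account running the notebook server):
#   sageserver  soft  nproc  200
#   sageserver  hard  nproc  300
```

Whether that actually prevents the clogging (rather than just making individual users' notebooks fail earlier) is exactly the kind of thing I'd like advice on.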

I have a calc lab worksheet which requires heavy use of some @interact
cells featuring graphs, and the machine consistently gets completely
clogged, to the point where some instances of the Sage notebook cannot
find the Sage libraries (?! does that even make sense?) and where
eventually people can log in but are then unable to evaluate even one
cell successfully. The machine does not stop, but it essentially has
no free memory from then on.

I have absolutely no idea how interact works on the Python level. Is
it possible that many, many processes that never stop are being created
à la what William/Michael described, simply as a natural result of,
say, 10 people using interact and moving sliders every 30 seconds or so?

If that makes no sense at all, is it possible that those images or
results are nonetheless being cached in memory (I know this is related
to Harald/Jason's recent thread, but again I don't quite understand how
interact works at a technical level) to the point that all virtual
memory in the VMware image could be used up quickly in such a scenario?
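In case it helps anyone diagnose along with me, this is the kind of server-side check I've been trying; it is purely a diagnostic sketch, and the assumption that the worker processes' command names start with `sage` may well be wrong:

```shell
# Count processes whose command name starts with "sage" (the name is an
# assumption; adjust it to whatever the notebook's workers are actually called).
# If this number climbs and never falls while students drag sliders, that would
# support the "processes that don't stop" theory.
ps -e -o comm= | grep -c '^sage' || true

# Show memory and swap usage in MB; if swap "used" keeps growing during the
# lab, the in-memory caching theory gains weight instead.
free -m
```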

Thanks to anyone with ideas - I'm hoping to be able to guarantee that
everyone can actually *do* the lab for the next iteration!

- kcrisman
To post to this group, send email to sage-support@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/sage-support
URLs: http://www.sagemath.org