While running solve() on a system of two big equations over the course of three days, I came back to find what I'd consider bizarre memory usage. solve() had not completed (not unexpected), and the Python process had a commit charge of 100 GB with only 6 GB in the working set (unexpected). I can't remember the peak working set exactly, but it was at most 50 GB and most likely closer to 32 GB. The last time I saw memory usage like this (high commit charge, low working set), the Python module I was using had a reference leak.
Is this memory usage suspicious? Does SymPy purport to be able to solve a system of two big nonlinear equations? Anticipating a possible response to that second question: I initially experimented with nonlinsolve() instead of solve(), but had better success with solve() on a simplified version of the equations in which many symbols had been numerically substituted. So I tried solve() on the non-substituted version and got the strange memory behavior described above.

I am using the latest version of SymPy, pulled from the GitHub repo a week or two ago. Python 2.7, Windows 7, Intel Xeon E5 v3 (6 cores with hyperthreading), 64 GB physical memory.

I love SymPy, especially the mechanics module. Thanks for this great software.
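For concreteness, a minimal sketch of the substitution-then-solve workflow I described. The symbols and equations here are stand-ins (the real system is far larger); only the pattern of substituting numeric values for some symbols before calling solve() matches what I actually did:

```python
from sympy import symbols, solve

x, y, a, b = symbols('x y a b')

# Hypothetical stand-ins for the two big nonlinear equations
eq1 = x**2 + a*y - 4
eq2 = b*x - y**2 + 1

# Numerically substitute some of the free symbols, as in the
# simplified version of my system, then solve for the rest
eq1_s = eq1.subs({a: 2})
eq2_s = eq2.subs({b: 3})

sols = solve([eq1_s, eq2_s], [x, y])
```

On the full system, the same call with no substitutions is what ran for three days with the memory behavior above.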
