My plots are in KiB, not MiB (I didn't do anything to post-process the output 
of syrupy). A footprint of >10 GiB for 1.3 M cells is consistent with our 
typical (excessive) memory usage.
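If anyone wants to sanity-check the counters on their own machine, here's a minimal sketch, assuming a Linux system with /proc (as on Mike's Ubuntu box; it won't work on OS X), that samples RSS and VSIZE once per step the way syrupy's plots do. The loop body is a stand-in, not anything from fipy-dft.py:

```python
# Minimal sketch (assumes Linux and its /proc filesystem): sample this
# process's RSS and VSIZE each step. A true leak shows up as VSIZE growing
# without bound; RSS alone can wobble with swapping and other processes.
def memory_kib():
    """Return (VmRSS, VmSize) in KiB, read from /proc/self/status."""
    fields = {}
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith(("VmRSS:", "VmSize:")):
                key, rest = line.split(":", 1)
                fields[key] = int(rest.split()[0])  # value looks like "<n> kB"
    return fields["VmRSS"], fields["VmSize"]

samples = []
for step in range(5):            # stand-in for the self-consistent loop
    work = [0.0] * 100_000       # stand-in for one step's allocations
    samples.append(memory_kib())

for i, (rss, vsize) in enumerate(samples):
    print(f"step {i}: RSS {rss} KiB, VSIZE {vsize} KiB")
```

If VSIZE in the last sample keeps climbing step over step while the workload is steady, that's leak-shaped; a one-time climb to a plateau is just the problem's working set.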

We've been using Trilinos for a long time (~10 years?), so if you can manage to 
build them, you can try Trilinos versions going back quite a way.

As for the previous calculations, you're right, of course. When I clear out the 
previous results, it still doesn't leak (for me), but it runs forever.
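For reference, the loop those knobs control looks roughly like this. This is a sketch based only on the parameter descriptions quoted below; the names `steps`, `accuracy`, and `initial_solver_iterations_per_step` are from fipy-dft.py, but the "solver sweep" body is a stand-in, not the script's actual solver:

```python
# Hedged sketch of the self-consistent control loop (NOT fipy-dft.py's code):
# `steps` caps the self-consistent iterations, `accuracy` is the relative
# energy change for convergence, and `initial_solver_iterations_per_step`
# is the number of solver sweeps taken inside each self-consistent iteration.
steps = 100
accuracy = 10.0**-6
initial_solver_iterations_per_step = 20

energy = 0.0
for step in range(steps):
    previous = energy
    for _ in range(initial_solver_iterations_per_step):
        # stand-in for one solver sweep relaxing toward a fixed point
        energy = 0.5 * (energy + -316.0)
    if abs(energy - previous) <= accuracy * abs(energy):
        break

print(f"stopped after {step + 1} self-consistent steps: {energy:.6f}")
```

With this stand-in, too few inner sweeps means each outer step moves little (slow), which matches Mike's note that more inner iterations per step improves convergence per self-consistent iteration.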

On Mar 30, 2016, at 1:10 PM, Michael Waters <[email protected]> wrote:

> Hi Jon, 
> 
> Last night I compiled and tried Trilinos 11.10 from the Trilinos git repo, and 
> I am still seeing the memory leak, in both RSS and VSIZE. Looking back at the 
> memory numbers in your previous email, they seem too small even to store the 
> float data for all the cells. 
> 
> The slow convergence that Trevor is seeing is normal. I did some more testing 
> and found that using initial_solver_iterations_per_step = 20 improves 
> convergence per self-consistent iteration in this example. The total energy 
> should go to about -316.xx eV when it is converged. In the terminal output, it 
> looks like this:
> 
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> Step 24 Total Energy -315.957897784 dE 1.86055284542 dE/|toten| 
> 0.00588861002833 max_rel_error 0.125229938669
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> 
> Also, the script reads previous calculations as starting conditions; to 
> prevent this, delete the psi_e_*.txt files. 
> 
> I am running these calculations on 64-bit Ubuntu 14.04. Trevor, what are you 
> running on? I might be able to help with compilation. 
> 
> What is the oldest version of Trilinos that FiPy has used? I could try that 
> and see if my problem persists.
> 
> Thanks again!
> -Mike Waters
> 
> On 3/30/16 11:52 AM, Guyer, Jonathan E. Dr. (Fed) wrote:
>> It looked to me like steps and accuracy were the way to do it, but my runs 
>> finish in one step, so I was confused. When I change to accuracy = 10.0**-6, 
>> it takes 15 steps, but there's still no leak (note: the hiccup in RSS and in 
>> ELAPSED time is because I put my laptop to sleep for a while, but VSIZE is 
>> rock-steady).
>> 
>> The fact that things never (or slowly) converge for you and Trevor, in 
>> addition to the leak, makes me wonder if Trilinos seriously broke something 
>> between 11.x and 12.x. Trevor's been struggling to build 12.4. I'll try to 
>> find time to do the same.
>> 
>> In case it matters, I'm running on OS X. What's your system?
>> 
>> - Jon
>> 
>> On Mar 29, 2016, at 3:59 PM, Michael Waters <[email protected]> wrote:
>> 
>>> When I did my testing and made those graphs, I ran Trilinos in serial; 
>>> Syrupy didn't seem to track the other processes' memory. I did watch in 
>>> real time as the parallel version ate all my RAM, though.
>>> 
>>> To make the program run longer while not changing the memory:
>>> 
>>> steps = 100  # increase this (limits the number of self-consistent 
>>> iterations)
>>> accuracy = 10.0**-5  # make this number smaller (relative energy 
>>> eigenvalue change for being considered converged)
>>> initial_solver_iterations_per_step = 7  # reduce this to 1 (number of 
>>> solver iterations per self-consistent iteration; too small and it's slow, 
>>> too high and the solutions are not stable)
>>> 
>>> I did those tests on a machine with 128 GB of RAM, so I wasn't expecting 
>>> any swapping.
>>> 
>>> Thanks,
>>> -mike
>>> 
>>> 
>>> On 3/29/16 3:38 PM, Guyer, Jonathan E. Dr. (Fed) wrote:
>>>> I guess I spoke too soon. FWIW, I'm running Trilinos version: 11.10.2.
>>>> 
>>>> 
>>>> On Mar 29, 2016, at 3:34 PM, Guyer, Jonathan E. Dr. (Fed) 
>>>> <[email protected]> wrote:
>>>> 
>>>>> I'm not seeing a leak. The plot below is for Trilinos. VSIZE grows to about 
>>>>> 11 MiB and saturates, and RSS saturates at around 5 MiB. VSIZE is more 
>>>>> relevant for tracking leaks, since RSS is deeply tied to your system's 
>>>>> swapping architecture and whatever else is running; either way, neither 
>>>>> seems to be leaking, though this problem does use a lot of memory.
>>>>> 
>>>>> What do I need to do to get it to run longer?
>>>>> 
>>>>> 
>>>>> 
>>>>> On Mar 25, 2016, at 7:16 PM, Michael Waters <[email protected]> 
>>>>> wrote:
>>>>> 
>>>>>> Hello,
>>>>>> 
>>>>>> I still have a large memory leak when using Trilinos. I am not sure 
>>>>>> where to start looking, so I made an example code that reproduces my 
>>>>>> problem in hopes that someone can help me.
>>>>>> 
>>>>>> But! My example is cool: I implemented Density Functional Theory in FiPy!
>>>>>> 
>>>>>> My code is slow, but it runs in parallel and is simple (relative to most 
>>>>>> DFT codes). The attached example is just a lithium and a hydrogen atom. 
>>>>>> The electrostatic boundary conditions are goofy but work well enough for 
>>>>>> demonstration purposes. If you set use_trilinos to True, the code will 
>>>>>> slowly use more memory. If not, it will try to use Pysparse.
>>>>>> 
>>>>>> Thanks,
>>>>>> -Michael Waters
>>>>>> <input.xyz><fipy-dft.py>
>>>>>> _______________________________________________
>>>>>> fipy mailing list
>>>>>> [email protected]
>>>>>> http://www.ctcms.nist.gov/fipy
>>>>>>  [ NIST internal ONLY: https://email.nist.gov/mailman/listinfo/fipy ]
>>>>> 
>>>>> <attachment.png>
>>>> 
>>> 
>> <Mail Attachment.png>
>> 
> 

