Dear Daniel,

Thank you. I got the point.

I made a run with the original mesh20x20.py example, commenting out the 
viewer lines and using PySparse on 1 node with 1 CPU, and got:
True
Implicit transient diffusion. Press <return> to proceed...


Then with 8 cores on 1 node, exporting FIPY_SOLVERS=Trilinos, I got after a long 
run:
2 total processes killed (some possibly by mpirun during cleanup)
mpirun noticed that process rank 1 with PID 30180 on node m466 exited on signal 
11 (Segmentation fault).
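
For reference, here are the two invocations as I understand them (a sketch only: I am assuming the example lives at examples/diffusion/mesh20x20.py in the FiPy source tree, and that mpirun comes from the same MPI build that Trilinos was compiled against):

```shell
# Serial run with the PySparse solvers (1 node, 1 CPU)
FIPY_SOLVERS=Pysparse python examples/diffusion/mesh20x20.py

# Parallel run on 8 cores with the Trilinos solvers
FIPY_SOLVERS=Trilinos mpirun -np 8 python examples/diffusion/mesh20x20.py
```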


Thank you.
Best wishes,
Ferenc

*
"If there's something you can't change then why worry about it?" by Fauja Singh,
 a 100-year-old marathon runner
*




On Jan 4, 2012, at 6:13 PM, Daniel Wheeler wrote:

> 
> 
> On Wed, Jan 4, 2012 at 12:08 PM, Daniel Wheeler <[email protected]> 
> wrote:
> On Wed, Jan 4, 2012 at 11:56 AM, Ferenc Tasnadi <[email protected]> wrote:
> Dear Daniel,
> 
> 
> Thank you for your response.
> 
> In my script I do not use any viewer.
> 
> In your first email you said that mesh20x20.py doesn't work for you in 
> parallel. I just wanted to confirm that the problem
> 
> Sorry, to be clear, when I refer to mesh20x20.py, I mean the script in 
> examples/diffusion/mesh20x20.py and not the script you posted.
> 
> -- 
> Daniel Wheeler
> _______________________________________________
> fipy mailing list
> [email protected]
> http://www.ctcms.nist.gov/fipy
>  [ NIST internal ONLY: https://email.nist.gov/mailman/listinfo/fipy ]

