As far as I know, yes. There's also a good rundown of build options on SCOREC:

https://redmine.scorec.rpi.edu/projects/albany-rpi/wiki/Installing_Trilinos


Trevor

________________________________
From: [email protected] <[email protected]> on behalf of Daniel Lewis 
<[email protected]>
Sent: Friday, December 2, 2016 12:52:47 PM
To: FIPY
Subject: Re: trilinos solver error

Trevor,

Are your previous build instructions still current? Just wondering if I can 
reproduce either of these outcomes.

Dan

On Dec 2, 2016, at 11:39 AM, Keller, Trevor (Fed) <[email protected]> 
wrote:

> Hi Shaun,
>
> I am unable to reproduce this error in my environment, using FiPy version 
> 3.1.dev134+g64f7866 and PyTrilinos version 12.3 (Dev). Here's the output I 
> get:
>
> $ mpirun -np 1 python examples/cahnHilliard/mesh2DCoupled.py --trilinos
> .../lib/python2.7/site-packages/matplotlib/collections.py:590: FutureWarning: 
> elementwise comparison failed; returning scalar instead, but in the future 
> will perform elementwise comparison
>   if self._edgecolors == str('face'):
>
> Coupled equations. Press <return> to proceed...
> False
>
> Does the example successfully run for you without Trilinos? without MPI? This 
> looks like a build environment problem, but I'm not sure exactly where.
>
> Trevor
> From: [email protected] <[email protected]> on behalf of Shaun Mucalo 
> <[email protected]>
> Sent: Thursday, December 1, 2016 7:37:34 PM
> To: FIPY
> Subject: Fwd: trilinos solver error
>
>
>
> Hello,
> I am trying to solve a problem with FiPy using the Trilinos
> solvers. However, when I run a problem with a 2D mesh, I get the
> following Trilinos error:
>
>
> $ python fipy/examples/cahnHilliard/mesh2DCoupled.py --trilinos
> Error in dorgqr on 0 row (dims are 800, 1)
>
> Error in CoarsenMIS: dorgqr returned a non-zero
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
>
> I get a similar error when running the doctests (the output of "mpirun -np 1
> python setup.py test --trilinos" is attached). I have manually tested
> ~20 of the examples; the 1D cases appear to work fine, while the 2D cases
> fail for any value of -np.
>
> Incidentally, when running the tests with -np > 1, they stall at
> "Doctest: fipy.terms.abstractDiffusionTerm._AbstractDiffusionTerm._buildMatrix...".
> I am not sure if this is related.
>
> Does anyone have any advice? Is this unique to my installation, or is it
> a known issue? Please let me know if there is any other information I
> can provide that could be useful.
>
> Regards,
> Shaun Mucalo
> _______________________________________________
> fipy mailing list
> [email protected]
> http://www.ctcms.nist.gov/fipy
>  [ NIST internal ONLY: https://email.nist.gov/mailman/listinfo/fipy ]


_______________________________________________
fipy mailing list
[email protected]
http://www.ctcms.nist.gov/fipy
  [ NIST internal ONLY: https://email.nist.gov/mailman/listinfo/fipy ]