I did have two MPI installations on my system, which seems to have been
the cause of the problem (apt-get install paraview also installs openmpi).
Explicitly pointing the libMesh configuration to the mpich2 binaries made
MUMPS work!
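For anyone hitting the same thing, a minimal sketch of what that pointing
can look like; the wrapper paths are assumptions based on the mpicc -show
output further down, not the exact invocation used here:

# Hypothetical paths: assumes the mpich2 wrappers live in /usr/local/bin
CC=/usr/local/bin/mpicc CXX=/usr/local/bin/mpicxx FC=/usr/local/bin/mpif90 ./configure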
Thanks a lot!
Jens
On Wed, Apr 18, 2012 at 11:47 AM, Jens Lohne Eftang wrote:
> PETSc's make test runs ex19 with 1 and 2 MPI processes and ex5f with 1 MPI
> process successfully.
>
I'm guessing the problem is with the way libMesh uses PETSc's compilers.
I'm not sure exactly how libMesh deals with it when PETSc doesn't …
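One way to check which compiler PETSc actually recorded (a sketch; the
conf/ layout below is what petsc-3.2 used and may differ in other versions):

# Show the C and Fortran compilers PETSc's configure wrote down
grep -E '^(CC|FC) ' $PETSC_DIR/$PETSC_ARCH/conf/petscvariables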
PETSc's make test runs ex19 with 1 and 2 MPI processes and ex5f with 1
MPI process successfully.
mpicc -show returns
gcc -I/usr/local/include -L/usr/local/lib -Wl,-rpath,/usr/local/lib
-lmpich -lopa -lmpl -lrt -lpthread
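A quick, generic way to check whether more than one MPI is visible on the
system (illustrative commands, not from the original mail):

which -a mpicc mpirun                      # every MPI wrapper on the PATH
ldconfig -p | grep -E 'libmpi\.|libmpich'  # MPI libraries the loader knows about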
Thanks again!
Jens
The PETSc configuration seems to be fine.
Are you able to run PETSc tests?
cd /home/eftang/fem_software/petsc-3.2-p5
make PETSC_DIR=/home/eftang/fem_software/petsc-3.2-p5 \
  PETSC_ARCH=arch-linux2-c-opt test
The compiler that gets configured by PETSc is a wrapper C compiler
inherited from mpich.
Check what mpicc -show returns.
On Mon, Apr 16, 2012 at 5:23 PM, Jens Lohne Eftang wrote:
> Thanks for your reply.
>
> The libmesh_LIBS output has references to mpi, -lmpich and -lmpichf90.
> Would it help to post the whole output?
Are they preceded by something like -Wl,-rpath, in the libmesh_LIBS output?
Perhaps something like …
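A hypothetical way to dump just that variable, assuming a Make.common
generated by libMesh's configure (as in 0.7.x-era builds):

grep '^libmesh_LIBS' $LIBMESH_DIR/Make.common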
Thanks for your reply.
The libmesh_LIBS output has references to mpi, -lmpich and -lmpichf90.
Would it help to post the whole output?
Jens
This seems to be a PETSc configuration problem.
Have your MPI libraries moved by any chance?
From the form of the missing symbol, I suspect the Fortran name mangling
may be the culprit.
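For reference, one way to check the mangling by hand (a sketch: both the
library name libfmpich.so and the /usr/local/lib path are assumptions
about this mpich2 install):

# Fortran MPI symbols usually appear as mpi_init_, mpi_init__ or MPI_INIT
nm -D /usr/local/lib/libfmpich.so | grep -i mpi_init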
Can you send PETSc's configure.log so I can take a look?
Thanks.
Dmitry.
Dear all,
I want to use the MUMPS direct sparse solvers. I've (seemingly)
successfully configured and compiled PETSc with MUMPS, and then
configured and compiled libMesh. Compiling programs works fine, but when
I try to run my program with
./program -pc_factor_mat_solver_package mumps
I get an error about a missing symbol.
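For context, a MUMPS-enabled petsc-3.2 configure line typically looks
something like the sketch below; the exact flags used in this build
aren't shown in the thread:

./configure --with-mpi-dir=/usr/local --download-mumps \
  --download-scalapack --download-blacs --download-parmetis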