This is not expected. To rule out a problem with the MPI, I would first
select a new PETSC_ARCH value, say arch-mpich, and do a
./configure --download-mpich. If you do not get the problem, then it is due to
OpenMPI; if you still get the problem, it is likely in PETSc or in your code.
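For example (the arch name is arbitrary; you would then rebuild PETSc and
relink your application against the new arch):

    ./configure PETSC_ARCH=arch-mpich --download-mpich
    make PETSC_ARCH=arch-mpich all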
Hi,
After updating from PETSc 3.7.5 to 3.7.7 (alongside OpenMPI; I don't know which
version was installed before, it is now 2.1.1), my application started failing
inside KSPSetUp() when run with more than one process. GDB backtrace:
#0 0x7fffca57f798 in mca_pml_ob1_recv_request_progress_rget () from
> On Mar 1, 2018, at 10:56 AM, Edoardo alinovi wrote:
>
> Dear Barry,
>
> thank you very much for the help. I will try to define the matrix as global
> and let you know if I have any problems. So, just to be sure, the following
> procedure is allowed:
>
> -
Dear All,
thanks to your suggestions and PETSc, I have finished programming my finite
volume code for CFD, and it runs in parallel, which is great.
I am now trying to improve its performance, and I have a question about
matrices in MPIAIJ format.
Basically, after the discretization, I have one matrix for