Hi all,
I wrote a PETSc code and defined my variables in separate files, namely
common.h and common.c, as follows:
common.h:
extern int i;
extern int n;
common.c:
#include
#include "common.h
PetscScalar i; // Define i and initialize
PetscScalar n; // Define n and
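For reference, a minimal sketch of this pattern with the declared and defined
types kept consistent (assuming PetscScalar globals were the intent; the file
and variable names are taken from the post above):

/* common.h */
#ifndef COMMON_H
#define COMMON_H
#include <petscsys.h>        /* provides PetscScalar */
extern PetscScalar i;        /* declared here, defined exactly once in common.c */
extern PetscScalar n;
#endif

/* common.c */
#include "common.h"
PetscScalar i = 0.0;         /* the single definitions, with initialization */
PetscScalar n = 0.0;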
We do a MPI_Comm_dup() for objects related to externalpackages.
Looks like we added a new mat type, MATHYPRE, in 3.8 that PCHYPRE is
using. Previously there was one MPI_Comm_dup() for PCHYPRE - now I think
there is one more for MATHYPRE - so there are more calls to MPI_Comm_dup() in 3.8 vs 3.7.
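For background on why the number of MPI_Comm_dup() calls matters at all: each
duplicate consumes an MPI communicator context, and implementations only
provide a finite number of them, so a code that creates many solvers can
eventually fail inside MPI_Comm_dup(). A standalone illustration (not from
this thread; the limit varies by MPI implementation):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
  static MPI_Comm comms[100000];
  int k = 0;
  MPI_Init(&argc, &argv);
  /* report errors instead of aborting, so the failing dup is visible */
  MPI_Comm_set_errhandler(MPI_COMM_WORLD, MPI_ERRORS_RETURN);
  while (k < 100000 && MPI_Comm_dup(MPI_COMM_WORLD, &comms[k]) == MPI_SUCCESS) k++;
  printf("stopped after %d successful MPI_Comm_dup() calls\n", k);
  MPI_Finalize();
  return 0;
}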
Hi Yann,
Thanks for pointing this out to us. Matt and I are the two most
actively developing in this area. We have been working on separate
threads and this looks like an issue where we need to sync up. I
think there is a simple fix, but it would be helpful to know which
version of PETSc you're using.
Sorry - I was going down the wrong path..
Sure MPI_COMM_WORLD vs PETSC_COMM_WORLD shouldn't make a difference
[except for a couple of extra mpi_comm_dup() calls.]
Satish
On Tue, 3 Apr 2018, Derek Gaston wrote:
> I’m working with Fande on this and I would like to add a bit more. There
> are
Are we sure this is a PETSc comm issue and not a hypre comm duplication issue?
frame #6: 0x0001061345d9
libpetsc.3.07.dylib`hypre_GenerateSubComm(comm=-1006627852,
participate=, new_comm_ptr=) + 409 at
gen_redcs_mat.c:531 [opt]
Looks like hypre is needed to generate subcomms, perhaps
I’m working with Fande on this and I would like to add a bit more. There
are many circumstances where we aren’t working on COMM_WORLD at all (e.g.
working on a sub-communicator) but PETSc was initialized using
MPI_COMM_WORLD (think multi-level solves)… and we need to create
arbitrarily many PETSc
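A minimal sketch of the situation described here: PETSc initialized on the
world communicator, with objects then created on a sub-communicator. The
split into halves is only illustrative.

#include <petscvec.h>

int main(int argc, char **argv)
{
  MPI_Comm       subcomm;
  Vec            v;
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  /* split the world into two halves and create a PETSc object on each half */
  ierr = MPI_Comm_split(PETSC_COMM_WORLD, rank % 2, rank, &subcomm);CHKERRQ(ierr);
  ierr = VecCreateMPI(subcomm, 10, PETSC_DECIDE, &v);CHKERRQ(ierr);
  ierr = VecDestroy(&v);CHKERRQ(ierr);
  ierr = MPI_Comm_free(&subcomm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}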
Why do we not use user-level MPI communicators directly? What are the potential
risks here?
Fande,
On Mon, Apr 2, 2018 at 5:08 PM, Satish Balay wrote:
> PETSC_COMM_WORLD [via PetscCommDuplicate()] attempts to minimize calls to
> MPI_Comm_dup() - thus potentially avoiding such errors
PETSC_COMM_WORLD [via PetscCommDuplicate()] attempts to minimize calls to
MPI_Comm_dup() - thus potentially avoiding such errors
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscCommDuplicate.html
Satish
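A rough sketch of what the man page above describes, for anyone who wants to
see the calls directly: the first PetscCommDuplicate() on a user communicator
performs the single MPI_Comm_dup(), and subsequent calls on the same
communicator reuse the cached inner communicator.

#include <petscsys.h>

static PetscErrorCode UseInnerComm(MPI_Comm user)
{
  MPI_Comm       inner1, inner2;
  PetscMPIInt    tag1, tag2;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = PetscCommDuplicate(user, &inner1, &tag1);CHKERRQ(ierr); /* may call MPI_Comm_dup() */
  ierr = PetscCommDuplicate(user, &inner2, &tag2);CHKERRQ(ierr); /* reuses the same inner comm */
  ierr = PetscCommDestroy(&inner1);CHKERRQ(ierr);
  ierr = PetscCommDestroy(&inner2);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}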
On Mon, 2 Apr 2018, Kong, Fande wrote:
> On Mon, Apr 2, 2018 at 4:23
On Mon, Apr 2, 2018 at 4:23 PM, Satish Balay wrote:
> Does this 'standard test' use MPI_COMM_WORLD to create PETSc objects?
>
> If so - you could try changing to PETSC_COMM_WORLD
>
I do not think we are using PETSC_COMM_WORLD when creating PETSc objects.
Why can we not use MPI_COMM_WORLD?
Does this 'standard test' use MPI_COMM_WORLD to create PETSc objects?
If so - you could try changing to PETSC_COMM_WORLD
Satish
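The suggested change only touches the communicator passed at object creation;
VecCreate below is just an illustrative stand-in for whatever objects the
test actually builds.

#include <petscvec.h>

static PetscErrorCode MakeVec(Vec *x)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* was: ierr = VecCreate(MPI_COMM_WORLD, x);CHKERRQ(ierr); */
  ierr = VecCreate(PETSC_COMM_WORLD, x);CHKERRQ(ierr);
  ierr = VecSetSizes(*x, PETSC_DECIDE, 100);CHKERRQ(ierr);
  ierr = VecSetFromOptions(*x);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}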
On Mon, 2 Apr 2018, Kong, Fande wrote:
> Hi All,
>
> I am trying to upgrade PETSc from 3.7.6 to 3.8.3 for MOOSE and its
> applications. I have an error message for a
Nope.
There is a backtrace:
* thread #1: tid = 0x3b477b4, 0x7fffb306cd42 libsystem_kernel.dylib`__pthread_kill + 10, queue = 'com.apple.main-thread', stop reason = signal SIGABRT
  * frame #0: 0x7fffb306cd42 libsystem_kernel.dylib`__pthread_kill + 10
maybe this will fix it?
diff --git a/src/ksp/pc/impls/hypre/hypre.c b/src/ksp/pc/impls/hypre/hypre.c
index 28addcf533..6a756d4c57 100644
--- a/src/ksp/pc/impls/hypre/hypre.c
+++ b/src/ksp/pc/impls/hypre/hypre.c
@@ -142,8 +142,7 @@ static PetscErrorCode PCSetUp_HYPRE(PC pc)
ierr =
Hi All,
I am trying to upgrade PETSc from 3.7.6 to 3.8.3 for MOOSE and its
applications. I have an error message for a standard test:

preconditioners/pbp.lots_of_variables: MPI had an error
preconditioners/pbp.lots_of_variables:
Hi,
I'm using DMForestTransferVec, as in "dm/impls/forest/examples/tests/ex2.c".
I would like to use it with a space approximation order of zero
(-petscspace_order 0). However, in this case it's not working (valgrind
output of ex2.c from the forest tests):
==8604== Conditional jump or move