Re: [petsc-users] VecDuplicate for FFTW-Vec causes VecDestroy to fail conditionally on VecLoad

2019-11-01 Thread Sajid Ali via petsc-users
Hi Junchao/Barry, It doesn't really matter what the h5 file contains, so I'm attaching a lightly edited version of src/vec/vec/examples/tutorials/ex10.c that produces a vector to be used as input for the above test case. (I'm working with `--with-scalar-type=complex`.) Now that I think …
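For readers landing on this thread from a search, the scenario under discussion can be sketched roughly as follows. This is a minimal, untested sketch, not Sajid's attached script: it assumes a PETSc build with `--with-scalar-type=complex`, `--download-fftw`, and `--download-hdf5`, and the file name `vec.h5`, dataset name `ex10_vec`, and size 64 are placeholders.

```c
/* Sketch: load a complex Vec from HDF5 into an FFTW-layout vector,
 * then VecDuplicate it -- the combination reported to trip VecDestroy. */
#include <petscmat.h>
#include <petscviewerhdf5.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, y, z, dup;
  PetscViewer    viewer;
  PetscInt       dim[1] = {64};       /* placeholder problem size */
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* FFTW matrix and its specially laid-out work vectors */
  ierr = MatCreateFFT(PETSC_COMM_WORLD, 1, dim, MATFFTW, &A);CHKERRQ(ierr);
  ierr = MatCreateVecsFFTW(A, &x, &y, &z);CHKERRQ(ierr);

  /* Load previously written data into the FFTW input vector; the
     object name must match the dataset name inside the .h5 file. */
  ierr = PetscObjectSetName((PetscObject)x, "ex10_vec");CHKERRQ(ierr);
  ierr = PetscViewerHDF5Open(PETSC_COMM_WORLD, "vec.h5", FILE_MODE_READ, &viewer);CHKERRQ(ierr);
  ierr = VecLoad(x, viewer);CHKERRQ(ierr);
  ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);

  /* The step reported to interact badly with the later VecDestroy */
  ierr = VecDuplicate(x, &dup);CHKERRQ(ierr);

  ierr = VecDestroy(&dup);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);
  ierr = VecDestroy(&z);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```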

Re: [petsc-users] VecDuplicate for FFTW-Vec causes VecDestroy to fail conditionally on VecLoad

2019-11-01 Thread Smith, Barry F. via petsc-users
> On Nov 1, 2019, at 4:50 PM, Zhang, Junchao via petsc-users wrote:
>
> I know nothing about Vec FFTW,

You are lucky :-)

> but if you can provide hdf5 files in your test, I will see if I can reproduce it.
> --Junchao Zhang
>
> On Fri, Nov 1, 2019 at 2:08 PM Sajid Ali via …

Re: [petsc-users] VecDuplicate for FFTW-Vec causes VecDestroy to fail conditionally on VecLoad

2019-11-01 Thread Zhang, Junchao via petsc-users
I know nothing about Vec FFTW, but if you can provide hdf5 files in your test, I will see if I can reproduce it. --Junchao Zhang On Fri, Nov 1, 2019 at 2:08 PM Sajid Ali via petsc-users wrote: Hi PETSc-developers, I'm unable to debug a crash with VecDestroy …

Re: [petsc-users] Do the guards against calling MPI_Comm_dup() in PetscCommDuplicate() apply with Fortran?

2019-11-01 Thread Smith, Barry F. via petsc-users
> On Nov 1, 2019, at 10:54 AM, Patrick Sanan wrote:
>
> Thanks, Barry. I should have realized that was an ancient version. The cluster does have Open MPI 4.0.1, so I'll see if we can't use that instead. (I'm sure that the old version is there just to provide continuity - the weird …

Re: [petsc-users] Do the guards against calling MPI_Comm_dup() in PetscCommDuplicate() apply with Fortran?

2019-11-01 Thread Smith, Barry F. via petsc-users
Certain OpenMPI versions have bugs where, even when you properly duplicate and then free communicators, the library eventually "runs out of communicators". This is definitely a bug, and it was fixed in later OpenMPI versions. We wasted a lot of time tracking it down in the past. By now it is an …
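The guard in the subject line can be illustrated with a small sketch: PetscCommDuplicate() caches an inner communicator on the user's communicator, so the loop below triggers at most one real MPI_Comm_dup() rather than one per iteration. This is a minimal illustration of the mechanism under discussion, not a reproducer for the OpenMPI bug.

```c
/* Sketch: repeated PetscCommDuplicate()/PetscCommDestroy() reuses the
 * cached inner communicator instead of calling MPI_Comm_dup() each time. */
#include <petscsys.h>

int main(int argc, char **argv)
{
  MPI_Comm       inner;
  PetscMPIInt    tag;
  PetscInt       i;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  for (i = 0; i < 10000; i++) {
    /* First call dups PETSC_COMM_WORLD; later calls return the cached
       inner comm with a fresh tag, creating no new MPI communicator. */
    ierr = PetscCommDuplicate(PETSC_COMM_WORLD, &inner, &tag);CHKERRQ(ierr);
    ierr = PetscCommDestroy(&inner);CHKERRQ(ierr);
  }
  ierr = PetscFinalize();
  return ierr;
}
```

A raw `MPI_Comm_dup()` in such a loop, by contrast, consumes a context id per call, which is what exhausts buggy MPI implementations.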

Re: [petsc-users] VI: RS vs SS

2019-11-01 Thread Munson, Todd via petsc-users
Yes, that looks weird. Can you send me the linear problem (M, q, l, and u) directly? I will take a look and run some other diagnostics with some of my other tools. Thanks, Todd.

> On Nov 1, 2019, at 10:14 AM, Alexander Lindsay wrote:
>
> No, the matrix is not symmetric because of how we …

Re: [petsc-users] VI: RS vs SS

2019-11-01 Thread Alexander Lindsay via petsc-users
No, the matrix is not symmetric because of how we impose some Dirichlet conditions on the boundary. I could easily give you the Jacobian for one of the "bad" problems. But at least in the case of RSLS, I don't know whether the algorithm is performing badly or whether the slow convergence is …
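For readers following along, the RS vs SS choice in this thread maps onto PETSc's two active-set VI solvers, SNESVINEWTONRSLS (reduced-space) and SNESVINEWTONSSLS (semismooth). A minimal toy that exercises either one is sketched below; the problem F(x) = x - 2 with bounds [0, 1] is purely illustrative and is not the (nonsymmetric) problem Alexander and Todd are discussing.

```c
/* Toy VI sketch: solve F(x) = x - 2 subject to 0 <= x <= 1.
 * Swap solvers with SNESSetType or -snes_type vinewtonrsls|vinewtonssls.
 * With these bounds the solution should sit at the upper bound, x_i = 1. */
#include <petscsnes.h>

static PetscErrorCode FormFunction(SNES snes, Vec x, Vec f, void *ctx)
{
  PetscErrorCode ierr;
  ierr = VecCopy(x, f);CHKERRQ(ierr);
  ierr = VecShift(f, -2.0);CHKERRQ(ierr); /* f = x - 2 */
  return 0;
}

static PetscErrorCode FormJacobian(SNES snes, Vec x, Mat J, Mat P, void *ctx)
{
  PetscErrorCode ierr;
  PetscInt       i, n;
  ierr = MatGetSize(P, &n, NULL);CHKERRQ(ierr);
  for (i = 0; i < n; i++) { /* Jacobian of x - 2 is the identity */
    ierr = MatSetValue(P, i, i, 1.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(P, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(P, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  return 0;
}

int main(int argc, char **argv)
{
  SNES           snes;
  Vec            x, r, xl, xu;
  Mat            J;
  PetscInt       n = 4;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = VecCreateSeq(PETSC_COMM_SELF, n, &x);CHKERRQ(ierr);
  ierr = VecDuplicate(x, &r);CHKERRQ(ierr);
  ierr = VecDuplicate(x, &xl);CHKERRQ(ierr);
  ierr = VecDuplicate(x, &xu);CHKERRQ(ierr);
  ierr = VecSet(x, 0.5);CHKERRQ(ierr);  /* feasible initial guess */
  ierr = VecSet(xl, 0.0);CHKERRQ(ierr); /* lower bound l */
  ierr = VecSet(xu, 1.0);CHKERRQ(ierr); /* upper bound u */
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, 1, NULL, &J);CHKERRQ(ierr);

  ierr = SNESCreate(PETSC_COMM_SELF, &snes);CHKERRQ(ierr);
  ierr = SNESSetType(snes, SNESVINEWTONRSLS);CHKERRQ(ierr);
  ierr = SNESVISetVariableBounds(snes, xl, xu);CHKERRQ(ierr);
  ierr = SNESSetFunction(snes, r, FormFunction, NULL);CHKERRQ(ierr);
  ierr = SNESSetJacobian(snes, J, J, FormJacobian, NULL);CHKERRQ(ierr);
  ierr = SNESSetFromOptions(snes);CHKERRQ(ierr);
  ierr = SNESSolve(snes, NULL, x);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&r);CHKERRQ(ierr);
  ierr = VecDestroy(&xl);CHKERRQ(ierr);
  ierr = VecDestroy(&xu);CHKERRQ(ierr);
  ierr = MatDestroy(&J);CHKERRQ(ierr);
  ierr = SNESDestroy(&snes);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}
```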