Re: [petsc-users] petsc4py error code 86 from ViewerHDF5().create

2024-03-12 Thread adigitoleo (Leon)








> You need to ./configure PETSc for HDF5 using
>
> > --with-fortran-bindings=0 --with-mpi-dir=/usr --download-hdf5
>

Thanks, this has worked. I assumed PETSc would just pick up the HDF5
library I already had on my system, but perhaps that requires
--with-hdf5-dir=/usr or something similar? Would this HDF5 library need
to be configured for MPI as well?
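(For reference, a hypothetical sketch of what pointing ./configure at a system HDF5 might look like; the /usr path is an assumption for this machine, not a tested recipe:)

```shell
# Hypothetical sketch, not a tested recipe: using a system HDF5 instead of
# --download-hdf5. The --with-hdf5-dir path is an assumption; the HDF5 found
# there must itself be built against the same MPI, or PETSc's configure will
# report an error and suggest alternatives.
./configure --with-fortran-bindings=0 --with-mpi-dir=/usr \
            --with-hdf5-dir=/usr
```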

The underworld3 test suite is mostly passing, but I do get a handful of
failures coming from

petsc4py.PETSc.SNES.getConvergedReason()

giving -3 instead of the expected 0. But that's more a question for the
underworld devs.
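(For context, a minimal sketch of decoding that return value; the numeric codes below are an assumption copied from PETSc's SNESConvergedReason enum and should be verified against the installed headers for your PETSc version:)

```python
# Minimal sketch: map SNES.getConvergedReason() integers to enum names.
# The values are an assumption taken from PETSc's SNESConvergedReason enum
# (include/petscsnes.h); verify them against your installed version.
SNES_REASONS = {
    0: "SNES_CONVERGED_ITERATING",     # solve not finished (or never ran)
    2: "SNES_CONVERGED_FNORM_ABS",
    3: "SNES_CONVERGED_FNORM_RELATIVE",
    -3: "SNES_DIVERGED_LINEAR_SOLVE",  # the inner KSP linear solve failed
    -5: "SNES_DIVERGED_MAX_IT",
}

def describe_reason(code: int) -> str:
    """Return a readable name for a SNES converged-reason code."""
    return SNES_REASONS.get(code, f"unknown reason {code}")

print(describe_reason(-3))  # -> SNES_DIVERGED_LINEAR_SOLVE
```

If -3 really is SNES_DIVERGED_LINEAR_SOLVE, running the failing tests with -ksp_converged_reason would show why the inner linear solve diverged.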

Leon



Re: [petsc-users] petsc4py error code 86 from ViewerHDF5().create

2024-03-12 Thread Barry Smith

   You need to ./configure PETSc for HDF5 using

> --with-fortran-bindings=0 --with-mpi-dir=/usr --download-hdf5

  It may need additional options; if it does, rerun ./configure with the
additional options it lists.


> On Mar 12, 2024, at 8:19 PM, adigitoleo (Leon)  wrote:
> 
> Hello,
> 
> I'm new to the list and have limited knowledge of PETSc so far, but
> I'm trying to use a software package (underworld3) that relies on petsc4py.
> I have built PETSc with the following configure options:
> 
> --with-fortran-bindings=0 --with-mpi-dir=/usr
> 
> and `make test` gives me 160 failures which all seem to be timeouts or
> arising from my having insufficient "slots" (cores?). I subsequently
> built underworld3 with something like
> 
> cd $PETSC_DIR
> PETSC_DIR=... PETSC_ARCH=... NUMPY_INCLUDE=... pip install 
> src/binding/petsc4py
> cd /path/to/underworld3/tree
> pip install h5py
> pip install mpi4py
> PETSC_DIR=... PETSC_ARCH=... NUMPY_INCLUDE=... pip install -e .
> 
> following their instructions. Building their python wheel/package was
> successful, however when I run their tests (using pytest) I get errors
> during test collection, which all come from petsc4py and have a stack
> trace that ends in the snippet attached below. Am I going about this
> wrong? How do I ensure that the HDF5 types are defined?
> 
> src/underworld3/discretisation.py:86: in _from_gmsh
> viewer = PETSc.ViewerHDF5().create(filename + ".h5", "w", 
> comm=PETSc.COMM_SELF)
> petsc4py/PETSc/Viewer.pyx:916: in petsc4py.PETSc.ViewerHDF5.create
> ???
> E   petsc4py.PETSc.Error: error code 86
> --- Captured stderr 
> 
> [0]PETSC ERROR: - Error Message 
> --
> [0]PETSC ERROR: Unknown type. Check for miss-spelling or missing package: 
> https://petsc.org/release/install/install/#external-packages
> [0]PETSC ERROR: Unknown PetscViewer type given: hdf5
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.20.4, unknown
> [0]PETSC ERROR: /home/leon/vcs/underworld3/.venv-underworld3/bin/pytest 
> on a arch-linux-c-debug named roci by leon Wed Mar 13 00:01:33 2024
> [0]PETSC ERROR: Configure options --with-fortran-bindings=0 
> --with-mpi-dir=/usr
> [0]PETSC ERROR: #1 PetscViewerSetType() at 
> /home/leon/vcs/petsc/src/sys/classes/viewer/interface/viewreg.c:535
> 
> Cheers,
> Leon



[petsc-users] petsc4py error code 86 from ViewerHDF5().create

2024-03-12 Thread adigitoleo (Leon)








Hello,

I'm new to the list and have limited knowledge of PETSc so far, but
I'm trying to use a software package (underworld3) that relies on petsc4py.
I have built PETSc with the following configure options:

--with-fortran-bindings=0 --with-mpi-dir=/usr

and `make test` gives me 160 failures which all seem to be timeouts or
arising from my having insufficient "slots" (cores?). I subsequently
built underworld3 with something like

cd $PETSC_DIR
PETSC_DIR=... PETSC_ARCH=... NUMPY_INCLUDE=... pip install src/binding/petsc4py
cd /path/to/underworld3/tree
pip install h5py
pip install mpi4py
PETSC_DIR=... PETSC_ARCH=... NUMPY_INCLUDE=... pip install -e .

following their instructions. Building their python wheel/package was
successful, however when I run their tests (using pytest) I get errors
during test collection, which all come from petsc4py and have a stack
trace that ends in the snippet attached below. Am I going about this
wrong? How do I ensure that the HDF5 types are defined?

src/underworld3/discretisation.py:86: in _from_gmsh
viewer = PETSc.ViewerHDF5().create(filename + ".h5", "w", comm=PETSc.COMM_SELF)
petsc4py/PETSc/Viewer.pyx:916: in petsc4py.PETSc.ViewerHDF5.create
???
E   petsc4py.PETSc.Error: error code 86
--- Captured stderr 
[0]PETSC ERROR: - Error Message --
[0]PETSC ERROR: Unknown type. Check for miss-spelling or missing package: https://petsc.org/release/install/install/#external-packages
[0]PETSC ERROR: Unknown PetscViewer type given: hdf5
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.20.4, unknown
[0]PETSC ERROR: /home/leon/vcs/underworld3/.venv-underworld3/bin/pytest on a arch-linux-c-debug named roci by leon Wed Mar 13 00:01:33 2024
[0]PETSC ERROR: Configure options --with-fortran-bindings=0 --with-mpi-dir=/usr
[0]PETSC ERROR: #1 PetscViewerSetType() at /home/leon/vcs/petsc/src/sys/classes/viewer/interface/viewreg.c:535

Cheers,
Leon



Re: [petsc-users] Compile Error in configuring PETSc with Cygwin on Windows by using Intel MPI

2024-03-12 Thread Satish Balay via petsc-users
Glad you have a successful build! Thanks for the update.

Satish

On Tue, 12 Mar 2024, 程奔 wrote:

> 
> Hi Satish
>   Sorry for replying to your email so late; I followed your suggestion and it 
> has been installed successfully.
>   Thank you so much.
> 
> best
> wishes,
> Ben
> 
> 
> > -----Original Message-----
> > From: "Satish Balay" 
> > Sent: 2024-03-06 18:21:45 (Wednesday)
> > To: 程奔 
> > Cc: petsc-users@mcs.anl.gov
> > Subject: Re: [petsc-users] Compile Error in configuring PETSc with Cygwin on 
> > Windows by using Intel MPI
> > 
> > > make[3]: *** No rule to make target 'w'.  Stop.
> > 
> > Try the following to overcome the above error:
> > 
> > make OMAKE_PRINTDIR=make all
> > 
> > However, 3.13.6 is a bit old, so I don't know whether it will work with these 
> > compiler versions.
> > 
> > Satish
> > 
> > On Wed, 6 Mar 2024, 程奔 wrote:
> > 
> > > Hello,
> > > 
> > > 
> > > Last time I installed PETSc 3.19.2 with Cygwin on Windows 10 successfully.
> > > 
> > > Recently I have been trying to install PETSc 3.13.6 with Cygwin, since I'd like to 
> > > use PETSc with Visual Studio on the Windows 10 platform. For the sake of clarity, 
> > > I first list the software/packages used below:
> > > 
> > > 1. PETSc: version 3.13.6
> > > 2. VS: version 2022 
> > > 3. Intel MPI: download Intel oneAPI Base Toolkit and HPC Toolkit
> > > 
> > > 
> > > 4. Cygwin
> > > 
> > > 5. External package: petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz
> > > 
> > > 
> > > And the compiler option in configuration is:
> > > 
> > > ./configure  --with-debugging=0  --with-cc='win32fe cl' 
> > > --with-fc='win32fe ifort' --with-cxx='win32fe cl'  
> > > 
> > > --download-fblaslapack=/cygdrive/g/mypetsc/petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz
> > >   --with-shared-libraries=0 
> > > 
> > > --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include 
> > > --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/lib/release/impi.lib
> > >  
> > > 
> > > --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec 
> > > 
> > > 
> > > 
> > > 
> > > Then I build PETSc libraries with:
> > > 
> > > make PETSC_DIR=/cygdrive/g/mypetsc/petsc-3.13.6 
> > > PETSC_ARCH=arch-mswin-c-opt all
> > > 
> > > but it returns an error:
> > > 
> > > **ERROR*
> > >   Error during compile, check arch-mswin-c-opt/lib/petsc/conf/make.log
> > >   Send it and arch-mswin-c-opt/lib/petsc/conf/configure.log to 
> > > petsc-ma...@mcs.anl.gov
> > > 
> > > So I write this email to report my problem and ask for your help.
> > > 
> > > 
> > > Looking forward to your reply!
> > > 
> > > 
> > > Sincerely,
> > > Cheng.
> > > 
> > > 
> > > 
> > > 
> 
> 


Re: [petsc-users] Compile Error in configuring PETSc with Cygwin on Windows by using Intel MPI

2024-03-12 Thread 程奔








Hi Satish
  Sorry for replying to your email so late; I followed your suggestion and it has been installed successfully.
  Thank you so much.

best
wishes,
Ben


> -----Original Message-----
> From: "Satish Balay" 
> Sent: 2024-03-06 18:21:45 (Wednesday)
> To: 程奔 
> Cc: petsc-users@mcs.anl.gov
> Subject: Re: [petsc-users] Compile Error in configuring PETSc with Cygwin on Windows by using Intel MPI
> 
> > make[3]: *** No rule to make target 'w'.  Stop.
> 
> Try the following to overcome the above error:
> 
> make OMAKE_PRINTDIR=make all
> 
> However, 3.13.6 is a bit old, so I don't know whether it will work with these compiler versions.
> 
> Satish
> 
> On Wed, 6 Mar 2024, 程奔 wrote:
> 
> > Hello,
> > 
> > 
> > Last time I installed PETSc 3.19.2 with Cygwin on Windows 10 successfully.
> > 
> > Recently I have been trying to install PETSc 3.13.6 with Cygwin, since I'd like to use PETSc with Visual Studio on the Windows 10 platform. For the sake of clarity, I first list the software/packages used below:
> > 
> > 1. PETSc: version 3.13.6
> > 2. VS: version 2022 
> > 3. Intel MPI: download Intel oneAPI Base Toolkit and HPC Toolkit
> > 
> > 
> > 4. Cygwin
> > 
> > 5. External package: petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz
> > 
> > And the compiler option in configuration is:
> > 
> > ./configure  --with-debugging=0  --with-cc='win32fe cl' --with-fc='win32fe ifort' --with-cxx='win32fe cl'  
> > 
> > --download-fblaslapack=/cygdrive/g/mypetsc/petsc-pkg-fblaslapack-e8a03f57d64c.tar.gz  --with-shared-libraries=0 
> > 
> > --with-mpi-include=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/include --with-mpi-lib=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/lib/release/impi.lib 
> > 
> > --with-mpiexec=/cygdrive/g/Intel/oneAPI/mpi/2021.10.0/bin/mpiexec 
> > 
> > 
> > 
> > 
> > Then I build PETSc libraries with:
> > 
> > make PETSC_DIR=/cygdrive/g/mypetsc/petsc-3.13.6 PETSC_ARCH=arch-mswin-c-opt all
> > 
> > but it returns an error:
> > 
> > **ERROR*
> >   Error during compile, check arch-mswin-c-opt/lib/petsc/conf/make.log
> >   Send it and arch-mswin-c-opt/lib/petsc/conf/configure.log to petsc-ma...@mcs.anl.gov
> > 
> > So I write this email to report my problem and ask for your help.
> > 
> > 
> > Looking forward to your reply!
> > 
> > 
> > Sincerely,
> > Cheng.
> > 
> > 
> > 
> > 



[petsc-users] Fieldsplit, multigrid and DM interaction

2024-03-12 Thread Marco Seiz








Hello,


I'd like to solve a Stokes-like equation with PETSc, i.e.


div( mu * symgrad(u) ) = -grad(p) - grad(mu*q)

div(u) = q


with the spatially variable coefficients (mu, q) coming from another 
application, which will advect and evolve fields via the velocity field 
u from the Stokes solution and throw new (mu, q) back to PETSc in a 
loop, everything using finite differences. In preparation for this, and 
to get used to PETSc, I wrote a simple inhomogeneous-coefficient Poisson 
solver, i.e.

  div( mu*grad(u) ) = -grad(mu*q), u unknown,

based on src/ksp/ksp/tutorials/ex32.c which converges really nicely even 
for mu contrasts of 10^10 using -ksp_type fgmres -pc_type mg. Since my 
coefficients later on can't be calculated from coordinates, I put them 
on a separate DM and attached it to the main DM via PetscObjectCompose 
and used a DMCoarsenHookAdd to coarsen the DM the coefficients live on, 
inspired by src/ts/tutorials/ex29.c .

Adding another uncoupled DoF was simple enough and it converged 
according to -ksp_converged_reason, but the solution started looking 
very weird; roughly constant for each DoF, when it should be some 
function going from roughly -value to +value due to symmetry. This 
doesn't happen when I use a direct solver ( -ksp_type preonly -pc_type 
lu -pc_factor_mat_solver_type umfpack ), and from reading the archives it 
seems I ought to be using -pc_type fieldsplit due to the block nature of 
the matrix. I did that and the solution looked sensible again.

Now here comes the actual problem: Once I try adding multigrid 
preconditioning to the split fields I get errors probably relating to 
fieldsplit not "inheriting" (for lack of a better term) the associated 
interpolations/added DMs and hooks on the fine DM. That is, when I use 
the DMDA_Q0 interpolation, fieldsplit dies because it switches to 
DMDA_Q1 and the size ratio is wrong ( Ratio between levels: (mx - 1)/(Mx 
- 1) must be integer: mx 64 Mx 32 ). When I use DMDA_Q1, once the KSP 
tries to set up the matrix on the coarsened problem, the DM no longer has 
the coefficient DMs which I previously associated with it, i.e. 
PetscCall(PetscObjectQuery((PetscObject)da, "coefficientdm", 
(PetscObject *)&dm_coeff)); puts a NULL pointer in dm_coeff and PETSc 
dies when trying to get a named vector from that, but it works nicely 
without fieldsplit.

Is there some way to get fieldsplit to automagically "inherit" those 
added parts or do I need to manually modify the DMs the fieldsplit is 
using? I've been using KSPSetComputeOperators since it allows for 
re-discretization without having to manage the levels myself, whereas 
some more involved examples like src/dm/impls/stag/tutorials/ex4.c build 
the matrices in advance when re-discretizing and set them with 
KSPSetOperators, which would avoid the problem as well but also means 
managing the levels.
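(For concreteness, an untested C fragment of the compose/query plus coarsen-hook pattern being described, modeled loosely on src/ts/tutorials/ex29.c; the "coefficientdm" name comes from the message above, while the hook and variable names are assumptions, and this is a sketch of the existing setup, not a fix for the fieldsplit issue:)

```c
/* Untested sketch of the pattern described above (requires PETSc headers).
 * The open question is whether the composed "coefficientdm" survives the
 * coarsening that PCFIELDSPLIT performs on its sub-DMs. */
#include <petscdm.h>

/* Hook so that each coarsened main DM also carries a coarsened coefficient
 * DM; registered on the fine DM with, e.g.:
 *   PetscCall(DMCoarsenHookAdd(da, CoarsenCoeffHook, NULL, NULL)); */
static PetscErrorCode CoarsenCoeffHook(DM fine, DM coarse, void *ctx)
{
  DM dm_coeff_fine, dm_coeff_coarse;

  PetscFunctionBeginUser;
  PetscCall(PetscObjectQuery((PetscObject)fine, "coefficientdm", (PetscObject *)&dm_coeff_fine));
  PetscCall(DMCoarsen(dm_coeff_fine, MPI_COMM_NULL, &dm_coeff_coarse));
  PetscCall(PetscObjectCompose((PetscObject)coarse, "coefficientdm", (PetscObject)dm_coeff_coarse));
  PetscCall(DMDestroy(&dm_coeff_coarse)); /* PetscObjectCompose took a reference */
  PetscFunctionReturn(PETSC_SUCCESS);
}
```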


Any advice concerning solving my target Stokes-like equation is welcome 
as well. I am coming from an explicit timestepping background, so reading 
up on saddle point problems and their efficient solution is all quite 
new to me.


Best regards,

Marco