Re: [petsc-users] PCHPDDM and matrix type

2023-04-17 Thread Pierre Jolivet
1) PCHPDDM handles AIJ, BAIJ, SBAIJ, IS, NORMAL, NORMALHERMITIAN, 
SCHURCOMPLEMENT, HTOOL
2) This PC is based on domain decomposition, with no support yet for “over 
decomposition”. If you run with a single process, it behaves like PCASM or 
PCBJACOBI: you’ll get the same behavior as if you were just using the sub PC 
(in this case, an exact factorization)
3) The error you are seeing is likely due to a failure while coarsening; I will 
ask you for some info
4) Unrelated, but you should probably not use --with-cxx-dialect=C++11 and 
instead stick to --with-cxx-dialect=11 (unless you have a good reason to)

Thanks,
Pierre

> On 18 Apr 2023, at 1:26 AM, Alexander Lindsay  
> wrote:
> 
> I don't really get much more of a stack trace out:
> 
> [0]PETSC ERROR: [1]PETSC ERROR: - Error Message 
> --
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: - Error Message 
> --
>  
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be 
> the program crashed before they were used or a spelling mistake, etc!
> [1]PETSC ERROR: Invalid argument
> [1]PETSC ERROR:  
> [0]PETSC ERROR:   Option left: name:-i value: full_upwinding_2D.i source: 
> command line
> [0]PETSC ERROR: [1]PETSC ERROR: WARNING! There are option(s) set that were 
> not used! Could be the program crashed before they were used or a spelling 
> mistake, etc!
>   Option left: name:-ksp_converged_reason value: ::failed source: code
> [0]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_mat_type value: baij 
> source: command line
> [0]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_pc_type value: lu 
> source: command line
> [1]PETSC ERROR:   Option left: name:-i value: full_upwinding_2D.i source: 
> command line
> [1]PETSC ERROR:   Option left: name:-ksp_converged_reason value: ::failed 
> source: code
> [0]PETSC ERROR:   Option left: name:-snes_converged_reason value: ::failed 
> source: code
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.17.4-3368-g5a48edb989d  
> GIT Date: 2023-04-16 17:35:24 +
> [1]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_mat_type value: baij 
> source: command line
> [1]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_pc_type value: lu 
> source: command line
> [0]PETSC ERROR: ../../../moose_test-opt on a arch-moose named rod.hpc.inl.gov 
>  by lindad Mon Apr 17 16:11:09 2023
> [0]PETSC ERROR: Configure options --download-hypre=1 
> --with-shared-libraries=1 --download-hdf5=1 --with-hdf5-fortran-bindings=0   
> --with-debugging=no --download-fblaslapack=1 --download-metis=1 
> --download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1 
> --download-mumps=1 --download-strumpack=1 --download-scalapack=1 
> --download-slepc=1 --with-mpi=1 --with-openmp=1 --with-cxx-dialect=C++11 
> --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices  
> --with-make-np=256 --download-hpddm
> [1]PETSC ERROR:   Option left: name:-snes_converged_reason value: ::failed 
> source: code
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: [0]PETSC ERROR: #1 buildTwo() at 
> /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
> Petsc Development GIT revision: v3.17.4-3368-g5a48edb989d  GIT Date: 
> 2023-04-16 17:35:24 +
> [1]PETSC ERROR: ../../../moose_test-opt on a arch-moose named rod.hpc.inl.gov 
>  by lindad Mon Apr 17 16:11:09 2023
> [1]PETSC ERROR: Configure options --download-hypre=1 
> --with-shared-libraries=1 --download-hdf5=1 --with-hdf5-fortran-bindings=0   
> --with-debugging=no --download-fblaslapack=1 --download-metis=1 
> --download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1 
> --download-mumps=1 --download-strumpack=1 --download-scalapack=1 
> --download-slepc=1 --with-mpi=1 --with-openmp=1 --with-cxx-dialect=C++11 
> --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices  
> --with-make-np=256 --download-hpddm
> [1]PETSC ERROR: #1 buildTwo() at 
> /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
> 
> On Mon, Apr 17, 2023 at 4:55 PM Matthew Knepley  > wrote:
>> I don't think so. Can you show the whole stack?
>> 
>>   Thanks,
>> 
>> Matt
>> 
>> On Mon, Apr 17, 2023 at 6:24 PM Alexander Lindsay > > wrote:
>>> If it helps: if I use those exact same options in serial, then no errors 
>>> and the linear solve is beautiful :-) 
>>> 
>>> On Mon, Apr 17, 2023 at 4:22 PM Alexander Lindsay >> > wrote:
 I'm likely revealing a lot of ignorance, but in order to use HPDDM as a 
 preconditioner does my system matrix (I am using the same matrix for A and 
 P) need to be block type, e.g. baij or sbaij ? In MOOSE 

Re: [petsc-users] Composing different Field Components into single Vector (or data array)

2023-04-17 Thread Blaise Bourdin



Hi,


You can simply use VecISCopy.
Have a look at the construction of isU and friends at ex26.c:291, and calls to VecISCopy around line 357.
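
Roughly, the calling pattern looks like this (untested sketch with placeholder
names; the IS plays the same role as isU in the example):

#include <petscvec.h>

/* Untested sketch: "full" is the full-space vector, "sub" a reduced
   (single-field) vector, and "is" the index set selecting where that field
   lives inside the full space (cf. isU in ex26.c). */
static PetscErrorCode CopySubToFullAndBack(Vec full, IS is, Vec sub)
{
  PetscFunctionBeginUser;
  PetscCall(VecISCopy(full, is, SCATTER_FORWARD, sub)); /* full[is[i]] = sub[i] */
  PetscCall(VecISCopy(full, is, SCATTER_REVERSE, sub)); /* sub[i] = full[is[i]] */
  PetscFunctionReturn(PETSC_SUCCESS);
}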


Blaise



On Apr 17, 2023, at 6:54 PM, James Wright  wrote:



Thanks for the reply! The `Vec`s are created through DMs, not sections, but DMPlex uses sections, so I think this still applies. Super/sub `DM`s sound like a good solution. The only thing that I'm not seeing in the examples is how to move `Vec` data from a sub `DM`'s `Vec` to a super `DM`'s `Vec`.

To be a bit more explicit, let's say I have 3 fields, A, B, C. I'm imagining creating a "main" `DM` with those fields and then creating sub `DM`s dmA, dmB, dmC. I need to have fields B and C together in a single `Vec`, and likewise A and C, so I'd create two super DMs, dm_AC and dm_BC.

If I have a global `Vec` on dm_BC, is it possible to get the C data onto a dm_AC `Vec`? I think I could use SubVectors and VecISCopy, but I am not entirely sure. There doesn't appear to be an example in the tutorials that puts these pieces together in this way.

Thanks,

James Wright
Graduate Research Assistant, PhD
University of Colorado Boulder

Cell: (864) 498 8869
Email: ja...@jameswright.xyz
Website: jameswright.xyz

On Mon, Apr 17, 2023 at 12:42 PM Blaise Bourdin  wrote:



Hi,


If you have created your vectors using sections, all you need is to call DMCreateSuperDM. See src/dm/impls/plex/tests/ex26.c:300 for an example.
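
In outline, it looks like this (untested sketch with placeholder names; see the
test for the full setup of the sub DMs and vectors):

#include <petscdm.h>

/* Untested sketch: build a super DM from two existing DMs and use the
   returned index sets to gather the two sub-vectors into one super-vector,
   along the lines of src/dm/impls/plex/tests/ex26.c. */
static PetscErrorCode GatherIntoSuperVec(DM dmA, DM dmB, Vec vecA, Vec vecB, Vec *superVec)
{
  DM  subdms[2], superdm;
  IS *is; /* is[0] and is[1] map the dofs of dmA and dmB into the super space */

  PetscFunctionBeginUser;
  subdms[0] = dmA;
  subdms[1] = dmB;
  PetscCall(DMCreateSuperDM(subdms, 2, &is, &superdm));
  PetscCall(DMCreateGlobalVector(superdm, superVec));
  PetscCall(VecISCopy(*superVec, is[0], SCATTER_FORWARD, vecA));
  PetscCall(VecISCopy(*superVec, is[1], SCATTER_FORWARD, vecB));
  PetscCall(ISDestroy(&is[0]));
  PetscCall(ISDestroy(&is[1]));
  PetscCall(PetscFree(is));
  PetscCall(DMDestroy(&superdm));
  PetscFunctionReturn(PETSC_SUCCESS);
}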


Blaise




On Apr 17, 2023, at 1:30 PM, James Wright  wrote:



Hello,

I currently have two DMPlex objects, both `DMClone`ed from the same "original" DMPlex object. They have different fields with different numbers of components on each of them. I would like to compose certain components from each into a single contiguous array.
 I'd also like to do this from a DMGlobal vector. What is the best/most robust way to do that?

Obviously I can just `VecGetArrayRead` each of the vectors and loop through them manually, but that relies on knowledge of the array data layout. There's `DMPlexGetLocalOffsets`, but since I want to use the global Vec, there isn't a corresponding `DMPlexGetGlobalOffsets`
(that I can see anyways). This manual looping would also require that the underlying arrangement of the DOFs is the same between the two DMPlex objects, but I assume this is true from the `DMClone` operation.

Thanks,

James Wright
Graduate Research Assistant, PhD
University of Colorado Boulder

Cell: (864) 498 8869
Email: ja...@jameswright.xyz
Website: jameswright.xyz

— 
Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243

— 
Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243

Re: [petsc-users] PCHPDDM and matrix type

2023-04-17 Thread Matthew Knepley
Yes, I cannot figure the error out. We will wait for Pierre to weigh in.

  Thanks,

 Matt

On Mon, Apr 17, 2023 at 7:26 PM Alexander Lindsay 
wrote:

> I don't really get much more of a stack trace out:
>
> [0]PETSC ERROR: [1]PETSC ERROR: - Error Message
> --
> [0]PETSC ERROR: Invalid argument
> [0]PETSC ERROR: - Error Message
> --
>
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
> [1]PETSC ERROR: Invalid argument
> [1]PETSC ERROR:
> [0]PETSC ERROR:   Option left: name:-i value: full_upwinding_2D.i source:
> command line
> [0]PETSC ERROR: [1]PETSC ERROR: WARNING! There are option(s) set that were
> not used! Could be the program crashed before they were used or a spelling
> mistake, etc!
>   Option left: name:-ksp_converged_reason value: ::failed source: code
> [0]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_mat_type value: baij
> source: command line
> [0]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_pc_type value: lu
> source: command line
> [1]PETSC ERROR:   Option left: name:-i value: full_upwinding_2D.i source:
> command line
> [1]PETSC ERROR:   Option left: name:-ksp_converged_reason value: ::failed
> source: code
> [0]PETSC ERROR:   Option left: name:-snes_converged_reason value: ::failed
> source: code
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.17.4-3368-g5a48edb989d
>  GIT Date: 2023-04-16 17:35:24 +
> [1]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_mat_type value: baij
> source: command line
> [1]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_pc_type value: lu
> source: command line
> [0]PETSC ERROR: ../../../moose_test-opt on a arch-moose named
> rod.hpc.inl.gov by lindad Mon Apr 17 16:11:09 2023
> [0]PETSC ERROR: Configure options --download-hypre=1
> --with-shared-libraries=1 --download-hdf5=1 --with-hdf5-fortran-bindings=0
>   --with-debugging=no --download-fblaslapack=1 --download-metis=1
> --download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1
> --download-mumps=1 --download-strumpack=1 --download-scalapack=1
> --download-slepc=1 --with-mpi=1 --with-openmp=1 --with-cxx-dialect=C++11
> --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices
>  --with-make-np=256 --download-hpddm
> [1]PETSC ERROR:   Option left: name:-snes_converged_reason value: ::failed
> source: code
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: [0]PETSC ERROR: #1 buildTwo() at
> /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
> Petsc Development GIT revision: v3.17.4-3368-g5a48edb989d  GIT Date:
> 2023-04-16 17:35:24 +
> [1]PETSC ERROR: ../../../moose_test-opt on a arch-moose named
> rod.hpc.inl.gov by lindad Mon Apr 17 16:11:09 2023
> [1]PETSC ERROR: Configure options --download-hypre=1
> --with-shared-libraries=1 --download-hdf5=1 --with-hdf5-fortran-bindings=0
>   --with-debugging=no --download-fblaslapack=1 --download-metis=1
> --download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1
> --download-mumps=1 --download-strumpack=1 --download-scalapack=1
> --download-slepc=1 --with-mpi=1 --with-openmp=1 --with-cxx-dialect=C++11
> --with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices
>  --with-make-np=256 --download-hpddm
> [1]PETSC ERROR: #1 buildTwo() at
> /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
>
> On Mon, Apr 17, 2023 at 4:55 PM Matthew Knepley  wrote:
>
>> I don't think so. Can you show the whole stack?
>>
>>   Thanks,
>>
>> Matt
>>
>> On Mon, Apr 17, 2023 at 6:24 PM Alexander Lindsay <
>> alexlindsay...@gmail.com> wrote:
>>
>>> If it helps: if I use those exact same options in serial, then no errors
>>> and the linear solve is beautiful :-)
>>>
>>> On Mon, Apr 17, 2023 at 4:22 PM Alexander Lindsay <
>>> alexlindsay...@gmail.com> wrote:
>>>
 I'm likely revealing a lot of ignorance, but in order to use HPDDM as a
 preconditioner does my system matrix (I am using the same matrix for A and
 P) need to be block type, e.g. baij or sbaij ? In MOOSE our default is aij
 and I am currently getting

 [1]PETSC ERROR: #1 buildTwo() at
 /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012

 with options:

 -pc_type hpddm -pc_hpddm_block_splitting -pc_hpddm_coarse_mat_type baij
 -pc_hpddm_coarse_pc_type lu -pc_hpddm_define_subdomains
 -pc_hpddm_levels_1_eps_gen_non_hermitian -pc_hpddm_levels_1_eps_nev 50
 -pc_hpddm_levels_1_eps_threshold 0.1 -pc_hpddm_levels_1_st_matstructure
 SAME -pc_hpddm_levels_1_st_share_sub_ksp
 -pc_hpddm_levels_1_sub_pc_factor_mat_solver_type mumps
 -pc_hpddm_levels_1_sub_pc_type lu

 Alex

>>>
>>
>> --

Re: [petsc-users] PCHPDDM and matrix type

2023-04-17 Thread Alexander Lindsay
I don't really get much more of a stack trace out:

[0]PETSC ERROR: [1]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: Invalid argument
[0]PETSC ERROR: - Error Message
--

[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
be the program crashed before they were used or a spelling mistake, etc!
[1]PETSC ERROR: Invalid argument
[1]PETSC ERROR:
[0]PETSC ERROR:   Option left: name:-i value: full_upwinding_2D.i source:
command line
[0]PETSC ERROR: [1]PETSC ERROR: WARNING! There are option(s) set that were
not used! Could be the program crashed before they were used or a spelling
mistake, etc!
  Option left: name:-ksp_converged_reason value: ::failed source: code
[0]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_mat_type value: baij
source: command line
[0]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_pc_type value: lu
source: command line
[1]PETSC ERROR:   Option left: name:-i value: full_upwinding_2D.i source:
command line
[1]PETSC ERROR:   Option left: name:-ksp_converged_reason value: ::failed
source: code
[0]PETSC ERROR:   Option left: name:-snes_converged_reason value: ::failed
source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.17.4-3368-g5a48edb989d
 GIT Date: 2023-04-16 17:35:24 +
[1]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_mat_type value: baij
source: command line
[1]PETSC ERROR:   Option left: name:-pc_hpddm_coarse_pc_type value: lu
source: command line
[0]PETSC ERROR: ../../../moose_test-opt on a arch-moose named
rod.hpc.inl.gov by lindad Mon Apr 17 16:11:09 2023
[0]PETSC ERROR: Configure options --download-hypre=1
--with-shared-libraries=1 --download-hdf5=1 --with-hdf5-fortran-bindings=0
  --with-debugging=no --download-fblaslapack=1 --download-metis=1
--download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1
--download-mumps=1 --download-strumpack=1 --download-scalapack=1
--download-slepc=1 --with-mpi=1 --with-openmp=1 --with-cxx-dialect=C++11
--with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices
 --with-make-np=256 --download-hpddm
[1]PETSC ERROR:   Option left: name:-snes_converged_reason value: ::failed
source: code
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: [0]PETSC ERROR: #1 buildTwo() at
/raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
Petsc Development GIT revision: v3.17.4-3368-g5a48edb989d  GIT Date:
2023-04-16 17:35:24 +
[1]PETSC ERROR: ../../../moose_test-opt on a arch-moose named
rod.hpc.inl.gov by lindad Mon Apr 17 16:11:09 2023
[1]PETSC ERROR: Configure options --download-hypre=1
--with-shared-libraries=1 --download-hdf5=1 --with-hdf5-fortran-bindings=0
  --with-debugging=no --download-fblaslapack=1 --download-metis=1
--download-ptscotch=1 --download-parmetis=1 --download-superlu_dist=1
--download-mumps=1 --download-strumpack=1 --download-scalapack=1
--download-slepc=1 --with-mpi=1 --with-openmp=1 --with-cxx-dialect=C++11
--with-fortran-bindings=0 --with-sowing=0 --with-64-bit-indices
 --with-make-np=256 --download-hpddm
[1]PETSC ERROR: #1 buildTwo() at
/raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012

On Mon, Apr 17, 2023 at 4:55 PM Matthew Knepley  wrote:

> I don't think so. Can you show the whole stack?
>
>   Thanks,
>
> Matt
>
> On Mon, Apr 17, 2023 at 6:24 PM Alexander Lindsay <
> alexlindsay...@gmail.com> wrote:
>
>> If it helps: if I use those exact same options in serial, then no errors
>> and the linear solve is beautiful :-)
>>
>> On Mon, Apr 17, 2023 at 4:22 PM Alexander Lindsay <
>> alexlindsay...@gmail.com> wrote:
>>
>>> I'm likely revealing a lot of ignorance, but in order to use HPDDM as a
>>> preconditioner does my system matrix (I am using the same matrix for A and
>>> P) need to be block type, e.g. baij or sbaij ? In MOOSE our default is aij
>>> and I am currently getting
>>>
>>> [1]PETSC ERROR: #1 buildTwo() at
>>> /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
>>>
>>> with options:
>>>
>>> -pc_type hpddm -pc_hpddm_block_splitting -pc_hpddm_coarse_mat_type baij
>>> -pc_hpddm_coarse_pc_type lu -pc_hpddm_define_subdomains
>>> -pc_hpddm_levels_1_eps_gen_non_hermitian -pc_hpddm_levels_1_eps_nev 50
>>> -pc_hpddm_levels_1_eps_threshold 0.1 -pc_hpddm_levels_1_st_matstructure
>>> SAME -pc_hpddm_levels_1_st_share_sub_ksp
>>> -pc_hpddm_levels_1_sub_pc_factor_mat_solver_type mumps
>>> -pc_hpddm_levels_1_sub_pc_type lu
>>>
>>> Alex
>>>
>>
>
> --
> What most experimenters take for granted before they begin their
> experiments is infinitely more interesting than any results to which their
> experiments lead.
> -- Norbert Wiener
>
> https://www.cse.buffalo.edu/~knepley/
> 
>


Re: [petsc-users] PCHPDDM and matrix type

2023-04-17 Thread Matthew Knepley
I don't think so. Can you show the whole stack?

  Thanks,

Matt

On Mon, Apr 17, 2023 at 6:24 PM Alexander Lindsay 
wrote:

> If it helps: if I use those exact same options in serial, then no errors
> and the linear solve is beautiful :-)
>
> On Mon, Apr 17, 2023 at 4:22 PM Alexander Lindsay <
> alexlindsay...@gmail.com> wrote:
>
>> I'm likely revealing a lot of ignorance, but in order to use HPDDM as a
>> preconditioner does my system matrix (I am using the same matrix for A and
>> P) need to be block type, e.g. baij or sbaij ? In MOOSE our default is aij
>> and I am currently getting
>>
>> [1]PETSC ERROR: #1 buildTwo() at
>> /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
>>
>> with options:
>>
>> -pc_type hpddm -pc_hpddm_block_splitting -pc_hpddm_coarse_mat_type baij
>> -pc_hpddm_coarse_pc_type lu -pc_hpddm_define_subdomains
>> -pc_hpddm_levels_1_eps_gen_non_hermitian -pc_hpddm_levels_1_eps_nev 50
>> -pc_hpddm_levels_1_eps_threshold 0.1 -pc_hpddm_levels_1_st_matstructure
>> SAME -pc_hpddm_levels_1_st_share_sub_ksp
>> -pc_hpddm_levels_1_sub_pc_factor_mat_solver_type mumps
>> -pc_hpddm_levels_1_sub_pc_type lu
>>
>> Alex
>>
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] PCHPDDM and matrix type

2023-04-17 Thread Alexander Lindsay
If it helps: if I use those exact same options in serial, then no errors
and the linear solve is beautiful :-)

On Mon, Apr 17, 2023 at 4:22 PM Alexander Lindsay 
wrote:

> I'm likely revealing a lot of ignorance, but in order to use HPDDM as a
> preconditioner does my system matrix (I am using the same matrix for A and
> P) need to be block type, e.g. baij or sbaij ? In MOOSE our default is aij
> and I am currently getting
>
> [1]PETSC ERROR: #1 buildTwo() at
> /raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012
>
> with options:
>
> -pc_type hpddm -pc_hpddm_block_splitting -pc_hpddm_coarse_mat_type baij
> -pc_hpddm_coarse_pc_type lu -pc_hpddm_define_subdomains
> -pc_hpddm_levels_1_eps_gen_non_hermitian -pc_hpddm_levels_1_eps_nev 50
> -pc_hpddm_levels_1_eps_threshold 0.1 -pc_hpddm_levels_1_st_matstructure
> SAME -pc_hpddm_levels_1_st_share_sub_ksp
> -pc_hpddm_levels_1_sub_pc_factor_mat_solver_type mumps
> -pc_hpddm_levels_1_sub_pc_type lu
>
> Alex
>


[petsc-users] PCHPDDM and matrix type

2023-04-17 Thread Alexander Lindsay
I'm likely revealing a lot of ignorance, but in order to use HPDDM as a
preconditioner does my system matrix (I am using the same matrix for A and
P) need to be block type, e.g. baij or sbaij ? In MOOSE our default is aij
and I am currently getting

[1]PETSC ERROR: #1 buildTwo() at
/raid/lindad/moose/petsc/arch-moose/include/HPDDM_schwarz.hpp:1012

with options:

-pc_type hpddm -pc_hpddm_block_splitting -pc_hpddm_coarse_mat_type baij
-pc_hpddm_coarse_pc_type lu -pc_hpddm_define_subdomains
-pc_hpddm_levels_1_eps_gen_non_hermitian -pc_hpddm_levels_1_eps_nev 50
-pc_hpddm_levels_1_eps_threshold 0.1 -pc_hpddm_levels_1_st_matstructure
SAME -pc_hpddm_levels_1_st_share_sub_ksp
-pc_hpddm_levels_1_sub_pc_factor_mat_solver_type mumps
-pc_hpddm_levels_1_sub_pc_type lu

Alex


Re: [petsc-users] Composing different Field Components into single Vector (or data array)

2023-04-17 Thread Blaise Bourdin



Hi,


If you have created your vectors using sections, all you need is to call DMCreateSuperDM. See src/dm/impls/plex/tests/ex26.c:300 for an example.


Blaise




On Apr 17, 2023, at 1:30 PM, James Wright  wrote:



Hello,

I currently have two DMPlex objects, both `DMClone`ed from the same "original" DMPlex object. They have different fields with different numbers of components on each of them. I would like to compose certain components from each into a single contiguous array.
 I'd also like to do this from a DMGlobal vector. What is the best/most robust way to do that?

Obviously I can just `VecGetArrayRead` each of the vectors and loop through them manually, but that relies on knowledge of the array data layout. There's `DMPlexGetLocalOffsets`, but since I want to use the global Vec, there isn't a corresponding `DMPlexGetGlobalOffsets`
(that I can see anyways). This manual looping would also require that the underlying arrangement of the DOFs is the same between the two DMPlex objects, but I assume this is true from the `DMClone` operation.

Thanks,

James Wright
Graduate Research Assistant, PhD
University of Colorado Boulder

Cell: (864) 498 8869
Email: ja...@jameswright.xyz
Website: jameswright.xyz

— 
Canada Research Chair in Mathematical and Computational Aspects of Solid Mechanics (Tier 1)
Professor, Department of Mathematics & Statistics
Hamilton Hall room 409A, McMaster University
1280 Main Street West, Hamilton, Ontario L8S 4K1, Canada 
https://www.math.mcmaster.ca/bourdin | +1 (905) 525 9140 ext. 27243

Re: [petsc-users] PETSc error only in debug build

2023-04-17 Thread Matteo Semplice
Adding PetscFunctionBeginUser indeed seems to fix this and moves the 
error to the next function without PetscFunctionBeginUser...


Thanks!

Matteo

On 17/04/23 18:27, Pierre Jolivet wrote:

On 17 Apr 2023, at 6:22 PM, Matteo Semplice 
 wrote:


Dear PETSc users,

    I am investigating a strange error occurring when using my code 
on a cluster; I managed to reproduce it on my machine as well and 
it's weird:


- on petsc 3.19, optimized build, the code runs fine, serial and parallel

- on petsc 3.19, --with-debugging=1, the code crashes without giving 
me a meaningful message. The output is


$ ../levelSet -options_file ../test.opts
Converting from ../pointClouds/2d/ptCloud_cerchio.txt in binary 
format: this is slow!

Pass in the .info file instead!
Read 50 particles from ../pointClouds/2d/ptCloud_cerchio.txt
Bounding box: [-0.665297, 0.67] x [-0.666324, 0.666324]
[0]PETSC ERROR: - Error Message 
--

[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Invalid stack size 0, pop convertCloudTxt 
clouds.cpp:139.


[0]PETSC ERROR: WARNING! There are option(s) set that were not used! 
Could be the program crashed before they were used or a spell

ing mistake, etc!
[0]PETSC ERROR:   Option left: name:-delta value: 1.0 source: file
[0]PETSC ERROR:   Option left: name:-dx value: 0.1 source: file
[0]PETSC ERROR:   Option left: name:-extraCells value: 5 source: file
[0]PETSC ERROR:   Option left: name:-maxIter value: 200 source: file
[0]PETSC ERROR:   Option left: name:-p value: 1.0 source: file
[0]PETSC ERROR:   Option left: name:-tau value: 0.1 source: file
[0]PETSC ERROR:   Option left: name:-u0tresh value: 0.3 source: file
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[0]PETSC ERROR: ../levelSet on a  named signalkuppe by matteo Mon Apr 
17 18:04:03 2023
[0]PETSC ERROR: Configure options --download-ml \ --with-metis 
--with-parmetis \ --download-hdf5 \ --with-triangle --with-gmsh \ P
ETSC_DIR=/home/matteo/software/petsc --PETSC_ARCH=dbg 
--with-debugging=1 --COPTFLAGS=-O --CXXOPTFLAGS=-O --FOPTFLAGS=-O 
--prefix=/

home/matteo/software/petsc/3.19-dbg/
[0]PETSC ERROR: #1 convertCloudTxt() at clouds.cpp:139
-- 


MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
with errorcode 77.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--

Now, line 139 of clouds.cpp is PetscFunctionReturn(PETSC_SUCCESS), so 
I cannot understand what the offending operation in that routine is. 
(Note: this is a conversion routine and skipping it just makes the 
next routine fail in a similar way...)


My student has also tried to compile PETSc with 
--with-strict-petscerrorcode and to fix all the compilation errors 
that were raised, but it didn't help.


Do you have any guess on what to look for?

There may be a PetscFunctionBeginUser; missing at the beginning of the 
convertCloudTxt() function.

Could you double-check this?

Thanks,
Pierre


Bonus question, to assess the cluster output: what is the default value 
for --with-debugging? If that option is not specified during PETSc 
configure, does one get an optimized or a debug build?


Thanks

    Matteo

--
Professore Associato in Analisi Numerica
Dipartimento di Scienza e Alta Tecnologia
Università degli Studi dell'Insubria
Via Valleggio, 11 - Como



--
---
Professore Associato in Analisi Numerica
Dipartimento di Scienza e Alta Tecnologia
Università degli Studi dell'Insubria
Via Valleggio, 11 - Como


Re: [petsc-users] PETSc error only in debug build

2023-04-17 Thread Pierre Jolivet


> On 17 Apr 2023, at 6:22 PM, Matteo Semplice  
> wrote:
> 
> Dear PETSc users,
> 
> I am investigating a strange error occurring when using my code on a 
> cluster; I managed to reproduce it on my machine as well and it's weird:
> 
> - on petsc 3.19, optimized build, the code runs fine, serial and parallel
> 
> - on petsc 3.19, --with-debugging=1, the code crashes without giving me a 
> meaningful message. The output is
> 
> $ ../levelSet -options_file ../test.opts  
> Converting from ../pointClouds/2d/ptCloud_cerchio.txt in binary format: this 
> is slow! 
> Pass in the .info file instead! 
> Read 50 particles from ../pointClouds/2d/ptCloud_cerchio.txt 
> Bounding box: [-0.665297, 0.67] x [-0.666324, 0.666324] 
> [0]PETSC ERROR: - Error Message 
> -- 
> [0]PETSC ERROR: Petsc has generated inconsistent data 
> [0]PETSC ERROR: Invalid stack size 0, pop convertCloudTxt clouds.cpp:139. 
> 
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could be 
> the program crashed before they were used or a spell
> ing mistake, etc! 
> [0]PETSC ERROR:   Option left: name:-delta value: 1.0 source: file 
> [0]PETSC ERROR:   Option left: name:-dx value: 0.1 source: file 
> [0]PETSC ERROR:   Option left: name:-extraCells value: 5 source: file 
> [0]PETSC ERROR:   Option left: name:-maxIter value: 200 source: file 
> [0]PETSC ERROR:   Option left: name:-p value: 1.0 source: file 
> [0]PETSC ERROR:   Option left: name:-tau value: 0.1 source: file 
> [0]PETSC ERROR:   Option left: name:-u0tresh value: 0.3 source: file 
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting. 
> [0]PETSC ERROR: Petsc Release Version 3.19.0, unknown  
> [0]PETSC ERROR: ../levelSet on a  named signalkuppe by matteo Mon Apr 17 
> 18:04:03 2023 
> [0]PETSC ERROR: Configure options --download-ml \ --with-metis 
> --with-parmetis \ --download-hdf5 \ --with-triangle --with-gmsh \ P
> ETSC_DIR=/home/matteo/software/petsc --PETSC_ARCH=dbg --with-debugging=1 
> --COPTFLAGS=-O --CXXOPTFLAGS=-O --FOPTFLAGS=-O --prefix=/
> home/matteo/software/petsc/3.19-dbg/ 
> [0]PETSC ERROR: #1 convertCloudTxt() at clouds.cpp:139 
> -- 
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF 
> with errorcode 77. 
> 
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. 
> You may or may not see output from other processes, depending on 
> exactly when Open MPI kills them. 
> --
> 
> Now, line 139 of clouds.cpp is PetscFunctionReturn(PETSC_SUCCESS), so I 
> cannot understand what the offending operation in that routine is. (Note: 
> this is a conversion routine and skipping it just makes the next routine 
> fail in a similar way...)
> 
> My student has also tried to compile PETSc with --with-strict-petscerrorcode 
> and to fix all the compilation errors that were raised, but it didn't help.
> 
> Do you have any guess on what to look for?
> 
There may be a PetscFunctionBeginUser; missing at the beginning of the 
convertCloudTxt() function.
Could you double-check this?
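
For reference, the expected pairing looks roughly like this (untested sketch,
generic routine name):

#include <petscsys.h>

/* Untested sketch: a user routine that ends with PetscFunctionReturn() should
   start with PetscFunctionBeginUser, otherwise debug builds report
   "Invalid stack size 0" when the return pops an empty debug stack. */
static PetscErrorCode MyConversionRoutine(const char *filename)
{
  PetscFunctionBeginUser;             /* pushes this frame on the debug stack */
  PetscCall(PetscPrintf(PETSC_COMM_SELF, "Converting %s\n", filename));
  PetscFunctionReturn(PETSC_SUCCESS); /* pops it again, so the sizes match */
}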

Thanks,
Pierre
> Bonus question, to assess the cluster output: what is the default value for 
> --with-debugging? If that option is not specified during PETSc configure, does 
> one get an optimized or a debug build?
> 
> Thanks
> 
> Matteo
> 
> -- 
> Professore Associato in Analisi Numerica
> Dipartimento di Scienza e Alta Tecnologia
> Università degli Studi dell'Insubria
> Via Valleggio, 11 - Como



[petsc-users] PETSc error only in debug build

2023-04-17 Thread Matteo Semplice

Dear PETSc users,

    I am investigating a strange error occurring when using my code on 
a cluster; I managed to reproduce it on my machine as well and it's weird:


- on petsc 3.19, optimized build, the code runs fine, serial and parallel

- on petsc 3.19, --with-debugging=1, the code crashes without giving me 
a meaningful message. The output is


$ ../levelSet -options_file ../test.opts
Converting from ../pointClouds/2d/ptCloud_cerchio.txt in binary format: 
this is slow!

Pass in the .info file instead!
Read 50 particles from ../pointClouds/2d/ptCloud_cerchio.txt
Bounding box: [-0.665297, 0.67] x [-0.666324, 0.666324]
[0]PETSC ERROR: - Error Message 
--

[0]PETSC ERROR: Petsc has generated inconsistent data
[0]PETSC ERROR: Invalid stack size 0, pop convertCloudTxt clouds.cpp:139.

[0]PETSC ERROR: WARNING! There are option(s) set that were not used! 
Could be the program crashed before they were used or a spell

ing mistake, etc!
[0]PETSC ERROR:   Option left: name:-delta value: 1.0 source: file
[0]PETSC ERROR:   Option left: name:-dx value: 0.1 source: file
[0]PETSC ERROR:   Option left: name:-extraCells value: 5 source: file
[0]PETSC ERROR:   Option left: name:-maxIter value: 200 source: file
[0]PETSC ERROR:   Option left: name:-p value: 1.0 source: file
[0]PETSC ERROR:   Option left: name:-tau value: 0.1 source: file
[0]PETSC ERROR:   Option left: name:-u0tresh value: 0.3 source: file
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[0]PETSC ERROR: ../levelSet on a  named signalkuppe by matteo Mon Apr 17 
18:04:03 2023
[0]PETSC ERROR: Configure options --download-ml \ --with-metis 
--with-parmetis \ --download-hdf5 \ --with-triangle --with-gmsh \ P
ETSC_DIR=/home/matteo/software/petsc --PETSC_ARCH=dbg --with-debugging=1 
--COPTFLAGS=-O --CXXOPTFLAGS=-O --FOPTFLAGS=-O --prefix=/

home/matteo/software/petsc/3.19-dbg/
[0]PETSC ERROR: #1 convertCloudTxt() at clouds.cpp:139
--
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
with errorcode 77.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--

Now, line 139 of clouds.cpp is PetscFunctionReturn(PETSC_SUCCESS), so I 
cannot understand what the offending operation in that routine is. 
(Note: this is a conversion routine and skipping it just makes the next 
routine fail in a similar way...)


My student has also tried to compile PETSc with 
--with-strict-petscerrorcode and to fix all the compilation errors 
that were raised, but it didn't help.


Do you have any guess on what to look for?

Bonus question, to assess the cluster output: what is the default value 
for --with-debugging? If that option is not specified during PETSc 
configure, does one get an optimized or a debug build?


Thanks

    Matteo

--
Professore Associato in Analisi Numerica
Dipartimento di Scienza e Alta Tecnologia
Università degli Studi dell'Insubria
Via Valleggio, 11 - Como


Re: [petsc-users] Using nonzero -pc_hypre_boomeramg_restriction_type in field split

2023-04-17 Thread Alexander Lindsay
Good to know. I may take a shot at it depending on need and time! Opened
https://gitlab.com/petsc/petsc/-/issues/1362 for doing so

Alex

On Sun, Apr 16, 2023 at 9:27 PM Pierre Jolivet 
wrote:

>
> On 17 Apr 2023, at 1:10 AM, Alexander Lindsay 
> wrote:
>
> Are there any plans to get the missing hook into PETSc for AIR? Just
> curious if there’s an issue I can subscribe to or anything.
>
>
> Not that I know of, but it would make for a nice contribution if you feel
> like creating a PR.
>
> Thanks,
> Pierre
>
> (Independently I’m excited to test HPDDM out tomorrow)
>
> On Apr 13, 2023, at 10:29 PM, Pierre Jolivet 
> wrote:
>
> 
>
> On 14 Apr 2023, at 7:02 AM, Alexander Lindsay 
> wrote:
>
> Pierre,
>
> This is very helpful information. Thank you. Yes I would appreciate those
> command line options if you’re willing to share!
>
>
> No problem, I’ll get in touch with you in private first, because it may
> require some extra work (need a couple of extra options in PETSc
> ./configure), and this is not very related to the problem at hand, so best
> not to spam the mailing list.
>
> Thanks,
> Pierre
>
> On Apr 13, 2023, at 9:54 PM, Pierre Jolivet 
> wrote:
>
> 
>
> On 13 Apr 2023, at 10:33 PM, Alexander Lindsay 
> wrote:
>
> Hi, I'm trying to solve steady Navier-Stokes for different Reynolds
> numbers. My options table
>
> -dm_moose_fieldsplit_names u,p
> -dm_moose_nfieldsplits 2
> -fieldsplit_p_dm_moose_vars pressure
> -fieldsplit_p_ksp_type preonly
> -fieldsplit_p_pc_type jacobi
> -fieldsplit_u_dm_moose_vars vel_x,vel_y
> -fieldsplit_u_ksp_type preonly
> -fieldsplit_u_pc_hypre_type boomeramg
> -fieldsplit_u_pc_type hypre
> -pc_fieldsplit_schur_fact_type full
> -pc_fieldsplit_schur_precondition selfp
> -pc_fieldsplit_type schur
> -pc_type fieldsplit
>
> works wonderfully for a low Reynolds number of 2.2. The solver performance
> crushes LU as I scale up the problem. However, not surprisingly this
> options table struggles when I bump the Reynolds number to 220. I've read
> that use of AIR (approximate ideal restriction) can improve performance for
> advection dominated problems. I've tried
> setting -pc_hypre_boomeramg_restriction_type 1 for a simple diffusion
> problem and the option works fine. However, when applying it to my
> field-split preconditioned Navier-Stokes system, I get immediate
> non-convergence:
>
>  0 Nonlinear |R| = 1.033077e+03
>   0 Linear |R| = 1.033077e+03
>   Linear solve did not converge due to DIVERGED_NANORINF iterations 0
> Nonlinear solve did not converge due to DIVERGED_LINEAR_SOLVE iterations 0
>
> Does anyone have an idea as to why this might be happening?
>
>
> Do not use this option, even when not part of PCFIELDSPLIT.
> There is some missing plumbing in PETSc which makes it unusable, see Ben’s
> comment here
> https://github.com/hypre-space/hypre/issues/764#issuecomment-1353452417.
> In fact, it’s quite easy to make HYPRE generate NaN with a very simple
> stabilized convection-diffusion problem near the pure convection limit
> (something that ℓAIR is supposed to handle).
> Even worse, you can make HYPRE fill your terminal with printf-style
> debugging messages
> https://github.com/hypre-space/hypre/blob/5546cc22d46b3dba253849f258786da47c9a7b21/src/parcsr_ls/par_lr_restr.c#L1416
>  with
> this option turned on.
> As a result, I have been unable to reproduce any of the ℓAIR results.
> This also explains why I have been using plain BoomerAMG instead of ℓAIR
> for the comparison in page 9 of https://arxiv.org/pdf/2201.02250.pdf (if
> you would like to try the PC we are using, I could send you the command
> line options).
>
> Thanks,
> Pierre
>
> If not, I'd take a suggestion on where to set a breakpoint to start my own
> investigation. Alternatively, I welcome other preconditioning suggestions
> for an advection dominated problem.
>
> Alex
>
>
>
>
>


Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Matthew Knepley
On Mon, Apr 17, 2023 at 6:37 AM Edoardo alinovi 
wrote:

> Sure thing, the solver I am working on is this one:
> https://gitlab.com/alie89/flubio-code-fvm.
>
> It is a 3D, collocated, unstructured, finite volume solver for the
> incompressible NS equations. I can run steady or unsteady, and I can use SIMPLE, PISO
> and the fractional step method (both explicit and fully implicit momentum). I
> can also solve for turbulence (k-omega, BSL, SST, Spalart-Allmaras, LES). I
> have also implemented some kind of Immersed boundary (2D/3D) that I need to
> resume at some point.
>
> Hot topic of the moment, I am developing a fully coupled pressure based
> solver using field-split. What I have now is working ok, I have validated
> it on a lot of 2D problems and going on with 3D right now.  If all the
> tests are passed, I'll focus on tuning the field splitting which looks to
> be a quite interesting topic!
>

I think a very good discussion of the issues from the point of view of FEM
is

   https://arxiv.org/abs/1810.03315

There should be a similar analysis from the FVM side, although it might not
be possible to
find a pressure discretization compatible with the FVM velocity for this
purpose.

  Thanks,

 Matt


> Flubio is a project I have been carrying on since PhD days. The
> implementation is 99% on my shoulders, despite the fact I am
> collaborating with some people around. I am coding evenings and
> weekends/free time, it gives me a lot of satisfaction and also a lot of
> insights!
>

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Edoardo alinovi
Sure thing, the solver I am working on is this one:
https://gitlab.com/alie89/flubio-code-fvm.

It is a 3D, collocated, unstructured, finite volume solver for the
incompressible NS equations. I can run steady or unsteady, and I can use SIMPLE, PISO
and the fractional step method (both explicit and fully implicit momentum). I
can also solve for turbulence (k-omega, BSL, SST, Spalart-Allmaras, LES). I
have also implemented some kind of Immersed boundary (2D/3D) that I need to
resume at some point.

Hot topic of the moment, I am developing a fully coupled pressure based
solver using field-split. What I have now is working ok, I have validated
it on a lot of 2D problems and going on with 3D right now.  If all the
tests are passed, I'll focus on tuning the field splitting which looks to
be a quite interesting topic!

Flubio is a project I have been carrying on since PhD days. The
implementation is 99% on my shoulders, despite the fact I am
collaborating with some people around. I am coding evenings and
weekends/free time, it gives me a lot of satisfaction and also a lot of
insights!


Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Matthew Knepley
On Mon, Apr 17, 2023 at 6:16 AM Edoardo alinovi 
wrote:

> Do you mean the solver I am messing around? XD
>

Yes, and what physics it is targeting.

  Thanks,

Matt
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Edoardo alinovi
Do you mean the solver I am messing around? XD


Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Matthew Knepley
On Mon, Apr 17, 2023 at 6:09 AM Edoardo alinovi 
wrote:

> Thanks Matt, you're always there when you need <3
>

Glad it's working! Sometime you have to tell me what it is solving.

  Thanks,

 Matt

-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Edoardo alinovi
Thanks Matt, you're always there when you need <3


Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Edoardo alinovi
Aaah yes you are right. Do not ask me why, but I was not getting this with
3.18.5, odd.


Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Matthew Knepley
On Mon, Apr 17, 2023 at 6:00 AM Edoardo alinovi 
wrote:

> Hey Matt,
>
> Thanks for the help. Here is the error:
>
> [0]PETSC ERROR: - Error Message
> --
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: Not for unassembled vector, did you call
> VecAssemblyBegin()/VecAssemblyEnd()?
> [0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
> [0]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value:
> 0.1E-0001 source: code
> [0]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_pc_type value:
> hypre source: code
> [0]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value:
> 0.1E-0001 source: code
> [0]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_pc_type value:
> bjacobi source: code
> [0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.19.0, unknown
> [0]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon
> Apr 17 12:05:28 2023
> [0]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3
> COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no
> -with-mpi-dir=/home/edo/user_software_repository/openmpi-4.1.2/build
> -download-fblaslapack=1 -download-superlu_dist -download-mumps
> -download-hypre -download-metis -download-parmetis -download-scalapack
> --download-ml -download-slepc -download-spai -download-fftw
> [0]PETSC ERROR: #1 MatMult() at
> /home/edo/user_software_repository/petsc/src/mat/interface/matrix.c:2557
> [1]PETSC ERROR: - Error Message
> --
> [1]PETSC ERROR: Object is in wrong state
> [1]PETSC ERROR: Not for unassembled vector, did you call
> VecAssemblyBegin()/VecAssemblyEnd()?
> [1]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
> [1]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value:
> 0.1E-0001 source: code
> [1]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_pc_type value:
> hypre source: code
> [1]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value:
> 0.1E-0001 source: code
> [1]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_pc_type value:
> bjacobi source: code
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.19.0, unknown
> [1]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon
> Apr 17 12:05:28 2023
> [1]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3
> COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no
> -with-mpi-dir=/home/edo/user_software_repository/openmpi-4.1.2/build
> -download-fblaslapack=1 -download-superlu_dist -download-mumps
> -download-hypre -download-metis -download-parmetis -download-scalapack
> --download-ml -download-slepc -download-spai -download-fftw
> [1]PETSC ERROR: #1 MatMult() at
> /home/edo/user_software_repository/petsc/src/mat/interface/matrix.c:2557
> [2]PETSC ERROR: - Error Message
> --
> [2]PETSC ERROR: Object is in wrong state
> [2]PETSC ERROR: Not for unassembled vector, did you call
> VecAssemblyBegin()/VecAssemblyEnd()?
> [2]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
> [2]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value:
> 0.1E-0001 source: code
> [2]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_pc_type value:
> hypre source: code
> [2]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value:
> 0.1E-0001 source: code
> [2]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_pc_type value:
> bjacobi source: code
> [2]PETSC ERROR: [3]PETSC ERROR: - Error Message
> --
> [3]PETSC ERROR: Object is in wrong state
> [3]PETSC ERROR: Not for unassembled vector, did you call
> VecAssemblyBegin()/VecAssemblyEnd()?
> [3]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
> [3]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value:
> 0.1E-0001 source: code
> [3]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_pc_type value:
> hypre source: code
> [3]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value:
> 0.1E-0001 source: code
> [3]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_pc_type value:
> bjacobi source: code
> See https://petsc.org/release/faq/ for trouble shooting.
> [2]PETSC ERROR: 

Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Edoardo alinovi
Hey Matt,

Thanks for the help. Here is the error:

[0]PETSC ERROR: - Error Message
--
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Not for unassembled vector, did you call
VecAssemblyBegin()/VecAssemblyEnd()?
[0]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
be the program crashed before they were used or a spelling mistake, etc!
[0]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value:
0.1E-0001 source: code
[0]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_pc_type value:
hypre source: code
[0]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value:
0.1E-0001 source: code
[0]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_pc_type value:
bjacobi source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[0]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon
Apr 17 12:05:28 2023
[0]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3
COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no
-with-mpi-dir=/home/edo/user_software_repository/openmpi-4.1.2/build
-download-fblaslapack=1 -download-superlu_dist -download-mumps
-download-hypre -download-metis -download-parmetis -download-scalapack
--download-ml -download-slepc -download-spai -download-fftw
[0]PETSC ERROR: #1 MatMult() at
/home/edo/user_software_repository/petsc/src/mat/interface/matrix.c:2557
[1]PETSC ERROR: - Error Message
--
[1]PETSC ERROR: Object is in wrong state
[1]PETSC ERROR: Not for unassembled vector, did you call
VecAssemblyBegin()/VecAssemblyEnd()?
[1]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
be the program crashed before they were used or a spelling mistake, etc!
[1]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value:
0.1E-0001 source: code
[1]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_pc_type value:
hypre source: code
[1]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value:
0.1E-0001 source: code
[1]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_pc_type value:
bjacobi source: code
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[1]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon
Apr 17 12:05:28 2023
[1]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3
COPTFLAGS=-O3 CXXOPTFLAGS=-O3 -with-debugging=no
-with-mpi-dir=/home/edo/user_software_repository/openmpi-4.1.2/build
-download-fblaslapack=1 -download-superlu_dist -download-mumps
-download-hypre -download-metis -download-parmetis -download-scalapack
--download-ml -download-slepc -download-spai -download-fftw
[1]PETSC ERROR: #1 MatMult() at
/home/edo/user_software_repository/petsc/src/mat/interface/matrix.c:2557
[2]PETSC ERROR: - Error Message
--
[2]PETSC ERROR: Object is in wrong state
[2]PETSC ERROR: Not for unassembled vector, did you call
VecAssemblyBegin()/VecAssemblyEnd()?
[2]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
be the program crashed before they were used or a spelling mistake, etc!
[2]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value:
0.1E-0001 source: code
[2]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_pc_type value:
hypre source: code
[2]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value:
0.1E-0001 source: code
[2]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_pc_type value:
bjacobi source: code
[2]PETSC ERROR: [3]PETSC ERROR: - Error Message
--
[3]PETSC ERROR: Object is in wrong state
[3]PETSC ERROR: Not for unassembled vector, did you call
VecAssemblyBegin()/VecAssemblyEnd()?
[3]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
be the program crashed before they were used or a spelling mistake, etc!
[3]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_ksp_rtol value:
0.1E-0001 source: code
[3]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_p_pc_type value:
hypre source: code
[3]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_ksp_rtol value:
0.1E-0001 source: code
[3]PETSC ERROR:   Option left: name:-UPeqn_fieldsplit_u_pc_type value:
bjacobi source: code
See https://petsc.org/release/faq/ for trouble shooting.
[2]PETSC ERROR: Petsc Release Version 3.19.0, unknown
[2]PETSC ERROR: flubio_coupled on a arch-gnu named betelgeuse by edo Mon
Apr 17 12:05:28 2023
[2]PETSC ERROR: Configure options PETSC_ARCH=arch-gnu FOPTFLAGS=-O3
COPTFLAGS=-O3 CXXOPTFLAGS=-O3 

Re: [petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Matthew Knepley
On Mon, Apr 17, 2023 at 5:36 AM Edoardo alinovi 
wrote:

> Hello Barry, Matt, Jed,
>
> I have just installed the latest and greatest version of petsc and I am
> hitting a problem I did not have in previous releases.
>
> Here is the error:
>
>
>
>
> [1]PETSC ERROR: - Error Message --
> [1]PETSC ERROR: Object is in wrong state
> [1]PETSC ERROR: Not for unassembled vector, did you call
> VecAssemblyBegin()/VecAssemblyEnd()?
> [1]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
> be the program crashed before they were used or a spelling mistake, etc!
>
> the vector x I am filling is created with:
>
>- call VecDuplicate(this%rhs, x, ierr)
>
> rhs is allocated and I called VecAssemblyBegin()/VecAssemblyEnd() on it; do I
> need to call them on duplicated vectors as well from now on?
>

Only if you change the values. Can you show the entire stack from the error
message?
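
Something along these lines (untested sketch with placeholder names):

#include <petscmat.h>

/* Untested sketch: a duplicated vector inherits the layout of the original,
   but once entries are set with VecSetValues, VecAssemblyBegin/End must be
   called on that vector before it is used, e.g. in MatMult. */
static PetscErrorCode FillDuplicateAndMult(Vec rhs, Mat A, Vec y)
{
  Vec         x;
  PetscInt    row = 0;
  PetscScalar one = 1.0;

  PetscFunctionBeginUser;
  PetscCall(VecDuplicate(rhs, &x));
  PetscCall(VecSetValues(x, 1, &row, &one, INSERT_VALUES));
  PetscCall(VecAssemblyBegin(x)); /* required after VecSetValues ... */
  PetscCall(VecAssemblyEnd(x));   /* ... and before the vector is used */
  PetscCall(MatMult(A, x, y));
  PetscCall(VecDestroy(&x));
  PetscFunctionReturn(PETSC_SUCCESS);
}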

  Thanks,

 Matt


> Thank you!
>
-- 
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which their
experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ 


[petsc-users] issues with VecSetValues in petsc 3.19

2023-04-17 Thread Edoardo alinovi
Hello Barry, Matt, Jed,

I have just installed the latest and greatest version of petsc and I am
hitting a problem I did not have in previous releases.

Here is the error:




[1]PETSC ERROR: - Error Message --
[1]PETSC ERROR: Object is in wrong state
[1]PETSC ERROR: Not for unassembled vector, did you call
VecAssemblyBegin()/VecAssemblyEnd()?
[1]PETSC ERROR: WARNING! There are option(s) set that were not used! Could
be the program crashed before they were used or a spelling mistake, etc!

the vector x I am filling is created with:

   - call VecDuplicate(this%rhs, x, ierr)

rhs is allocated and I called VecAssemblyBegin()/VecAssemblyEnd() on it; do I
need to call them on duplicated vectors as well from now on?

Thank you!