Re: [petsc-users] Implementing periodicity using DMPlex

2020-06-12 Thread Jed Brown
Please always use "reply-all" so that your messages go to the list. This is standard mailing list etiquette. It is important to preserve threading for people who find this discussion later and so that we do not waste our time re-answering the same questions that have already been answered in

Re: [petsc-users] Make stream

2020-06-12 Thread Jed Brown
Jed Brown writes: > Fande Kong writes: > >>> There's a lot more to AMG setup than memory bandwidth (architecture >>> matters a lot, even between different generation CPUs). >> >> >> Could you elaborate a bit more on this? From my understanding, one big part >> of AMG SetUp is RAP that should be

Re: [petsc-users] Direct linear solver in Petsc

2020-06-12 Thread Qin Lu via petsc-users
Thanks Barry! Qin On Friday, June 12, 2020, 6:18 PM, Barry Smith wrote: For the problem sizes you describe with sparse matrices the PETSc built in one is competitive with the external solvers, sometimes faster, it is not worth using the external solvers for such small problems. The

Re: [petsc-users] Direct linear solver in Petsc

2020-06-12 Thread Mark Adams
On Fri, Jun 12, 2020 at 1:18 PM Matthew Knepley wrote: > On Fri, Jun 12, 2020 at 12:49 PM Qin Lu via petsc-users < > petsc-users@mcs.anl.gov> wrote: > >> Hello, >> >> I plan to solve a small sparse linear equation system using the direct >> solver, since the number of unknowns is small (less

Re: [petsc-users] Direct linear solver in Petsc

2020-06-12 Thread Barry Smith
For the problem sizes you describe with sparse matrices, the PETSc built-in solver is competitive with the external solvers, sometimes faster; it is not worth using the external solvers for such small problems. The external solvers have much more elaborate implementations of sparse factorizations and
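
For reference, a minimal sketch (not from the thread) of selecting the built-in sparse LU through the KSP interface; A, b, and x are assumed to be an already-assembled Mat and Vecs, and the same thing can be requested on the command line with -ksp_type preonly -pc_type lu:

#include <petscksp.h>

/* Sketch: sequential sparse direct solve with PETSc's built-in LU.
   A, b, x are assumed to be an assembled Mat and Vecs. */
KSP            ksp;
PC             pc;
PetscErrorCode ierr;
ierr = KSPCreate(PETSC_COMM_SELF, &ksp);CHKERRQ(ierr);
ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
ierr = KSPSetType(ksp, KSPPREONLY);CHKERRQ(ierr); /* apply the preconditioner once, no Krylov iterations */
ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
ierr = PCSetType(pc, PCLU);CHKERRQ(ierr);         /* PETSc's built-in sparse LU factorization */
ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);      /* still overridable from the command line */
ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
ierr = KSPDestroy(&ksp);CHKERRQ(ierr);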

Re: [petsc-users] Petsc options

2020-06-12 Thread Satish Balay via petsc-users
You can specify it via an environment variable before running the application: export PETSC_OPTIONS="-info" mpiexec ./binary Or add the options to the ~/.petscrc file [one option per line] and then run the code. Satish On Fri, 12 Jun 2020, Y. Shidi wrote: > Dear developers, > > I want to use "-info" to view
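
A likely reason PetscOptionsSetValue() appeared not to work: -info is read during PetscInitialize(), so it must be in the options database before initialization. A minimal sketch of that (my reading of the docs, not from the thread):

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  /* PetscOptionsSetValue() is one of the few PETSc calls permitted
     before PetscInitialize(); setting -info after initialization is too late. */
  ierr = PetscOptionsSetValue(NULL, "-info", NULL); if (ierr) return ierr;
  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* ... create and preallocate matrices; the -info output now appears ... */
  ierr = PetscFinalize();
  return ierr;
}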

[petsc-users] Petsc options

2020-06-12 Thread Y. Shidi
Dear developers, I want to use "-info" to view the preallocation of the matrix. I cannot add it directly when I run the code. I have tried PetscOptionsInsertString() and PetscOptionsSetValue(); neither of them works. So I am wondering if there are any other functions. Thank you for your time.

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-12 Thread Danyang Su
Hi Jed, Attached is the example for your test. This example uses H5Sselect_none to tell the H5Dwrite call that there will be no data. The 4th process HAS to participate since we are in collective mode. The code is ported and modified based on the C example from
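
For readers of the archive, a condensed C sketch of the pattern described above; dset, offset, count, and buf are placeholders, and count may be 0 on some ranks:

#include <hdf5.h>

/* Collective H5Dwrite() where some ranks contribute no data. */
hsize_t mdim      = (count > 0) ? count : 1;   /* avoid a zero-sized simple dataspace */
hid_t   filespace = H5Dget_space(dset);
hid_t   memspace  = H5Screate_simple(1, &mdim, NULL);
if (count > 0) {
  H5Sselect_hyperslab(filespace, H5S_SELECT_SET, &offset, NULL, &count, NULL);
} else {
  H5Sselect_none(filespace);                    /* empty selection: this rank writes nothing */
  H5Sselect_none(memspace);
}
hid_t dxpl = H5Pcreate(H5P_DATASET_XFER);
H5Pset_dxpl_mpio(dxpl, H5FD_MPIO_COLLECTIVE);   /* collective: every rank must still call H5Dwrite */
H5Dwrite(dset, H5T_NATIVE_DOUBLE, memspace, filespace, dxpl, buf);
H5Pclose(dxpl); H5Sclose(memspace); H5Sclose(filespace);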

Re: [petsc-users] Implementing periodicity using DMPlex

2020-06-12 Thread jed
You need DMGlobalToLocal. VecGhost is a different thing. On Jun 12, 2020 13:17, Shashwat Tiwari wrote: Hi, I am writing a first order 2D solver for unstructured grids with periodic boundaries using DMPlex. After generating the mesh, I use the "DMSetPeriodicity" function to set periodicity in both
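
A minimal sketch of the suggested DMGlobalToLocal update, assuming an existing DM dm and global Vec g:

#include <petscdm.h>

/* Fill a local vector, including ghost/overlap points (and points
   identified across the periodic boundary), from the global vector. */
Vec            l;
PetscErrorCode ierr;
ierr = DMGetLocalVector(dm, &l);CHKERRQ(ierr);
ierr = DMGlobalToLocalBegin(dm, g, INSERT_VALUES, l);CHKERRQ(ierr);
ierr = DMGlobalToLocalEnd(dm, g, INSERT_VALUES, l);CHKERRQ(ierr);
/* ... evaluate fluxes/residuals using l, which now carries ghost values ... */
ierr = DMRestoreLocalVector(dm, &l);CHKERRQ(ierr);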

[petsc-users] Implementing periodicity using DMPlex

2020-06-12 Thread Shashwat Tiwari
Hi, I am writing a first order 2D solver for unstructured grids with periodic boundaries using DMPlex. After generating the mesh, I use the "DMSetPeriodicity" function to set periodicity in both directions, after which I partition the mesh (DMPlexDistribute) and construct ghost cells

Re: [petsc-users] Direct linear solver in Petsc

2020-06-12 Thread Qin Lu via petsc-users
Thanks Matthew and Mark! Qin On Friday, June 12, 2020, 12:22:08 PM CDT, Mark Adams wrote: On Fri, Jun 12, 2020 at 12:56 PM Qin Lu via petsc-users wrote: Hello, I plan to solve a small sparse linear equation system using the direct solver, since the number of unknowns is small

Re: [petsc-users] Direct linear solver in Petsc

2020-06-12 Thread Mark Adams
On Fri, Jun 12, 2020 at 12:56 PM Qin Lu via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello, > > I plan to solve a small sparse linear equation system using the direct > solver, since the number of unknowns is small (less than 1000). Here I got > a few questions: > > 1. Is there a general

Re: [petsc-users] Direct linear solver in Petsc

2020-06-12 Thread Matthew Knepley
On Fri, Jun 12, 2020 at 12:49 PM Qin Lu via petsc-users < petsc-users@mcs.anl.gov> wrote: > Hello, > > I plan to solve a small sparse linear equation system using the direct > solver, since the number of unknowns is small (less than 1000). Here I got > a few questions: > > 1. Is there a general

[petsc-users] Direct linear solver in Petsc

2020-06-12 Thread Qin Lu via petsc-users
Hello, I plan to solve a small sparse linear equation system using the direct solver, since the number of unknowns is small (less than 1000). Here I have a few questions: 1. Is there a general guideline on the size of system for which a direct solver is more efficient than an iterative solver? 2. Is

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-12 Thread Jed Brown
Danyang Su writes: > Hi Jed, > > Thanks for your double check. > > The HDF 1.10.6 version also works. But versions from 1.12.x stop working. I'd suggest making a reduced test case in order to submit a bug report. This was the relevant change in PETSc for hdf5-1.12.

Re: [petsc-users] Parallel writing in HDF5-1.12.0 when some processors have no data to write

2020-06-12 Thread Danyang Su
Hi Jed, Thanks for your double check. HDF5 1.10.6 also works, but versions from 1.12.x stop working. Attached is the code section where I have the problem. !c write the dataset collectively !!! CODE