Please always use "reply-all" so that your messages go to the list.
This is standard mailing list etiquette. It is important to preserve
threading for people who find this discussion later and so that we do
not waste our time re-answering the same questions that have already
been answered in
Jed Brown writes:
> Fande Kong writes:
>
>>> There's a lot more to AMG setup than memory bandwidth (architecture
>>> matters a lot, even between different generation CPUs).
>>
>>
>> Could you elaborate a bit more on this? From my understanding, one big part
>> of AMG SetUp is RAP, which should be
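[Editor's note: for context, the "RAP" mentioned above is the Galerkin triple product that forms the coarse-grid operator during AMG setup; a standard statement of it, assuming a prolongation operator P and restriction R, is:]

```latex
A_c = R A P, \qquad R = P^{\mathsf{T}} \text{ for symmetric problems, so } A_c = P^{\mathsf{T}} A P
```

Forming this sparse triple product is largely memory-bandwidth bound, but its cost also depends on sparsity patterns and cache behavior, which is presumably part of the architecture point made above.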
Thanks Barry!
Qin
On Friday, June 12, 2020, 6:18 PM, Barry Smith wrote:
For the problem sizes you describe with sparse matrices, the PETSc built-in
one is competitive with the external solvers, sometimes faster; it is not worth
using the external solvers for such small problems. The
On Fri, Jun 12, 2020 at 1:18 PM Matthew Knepley wrote:
> On Fri, Jun 12, 2020 at 12:49 PM Qin Lu via petsc-users <
> petsc-users@mcs.anl.gov> wrote:
>
>> Hello,
>>
>> I plan to solve a small sparse linear equation system using the direct
>> solver, since the number of unknowns is small (less
For the problem sizes you describe with sparse matrices, the PETSc built-in
one is competitive with the external solvers, sometimes faster; it is not worth
using the external solvers for such small problems. The external solvers have
much more elaborate implementations of sparse factorizations and
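[Editor's note: a minimal sketch of how such a run could be set up from the command line, using the standard PETSc options for a direct solve; "./app" is a hypothetical executable name, and superlu_dist is only available if PETSc was configured with it:]

```shell
# Built-in sparse LU as a direct solver: skip Krylov iteration (preonly)
# and use the LU preconditioner as the full solve.
opts="-ksp_type preonly -pc_type lu"
# Optional external factorization package, if PETSc was built with it:
ext="-pc_factor_mat_solver_type superlu_dist"

# Commands are echoed here rather than executed, since "./app" is a stand-in:
echo "mpiexec -n 1 ./app $opts"
echo "mpiexec -n 4 ./app $opts $ext"
```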
You can:
Specify via an environment variable before running the application:
export PETSC_OPTIONS="-info"
mpiexec ./binary
Or add the options to the ~/.petscrc file [one option per line] and then run the code.
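[Editor's note: the two routes above can be sketched as follows; the options file is written to a demo path here rather than ~/.petscrc so the sketch has no side effects, and "./binary" is the application from above:]

```shell
# Route 1: environment variable, read by PetscInitialize() at startup.
export PETSC_OPTIONS="-info"
# mpiexec -n 2 ./binary

# Route 2: an options file, one option per line (demo path, not ~/.petscrc).
rcfile=petscrc.demo
cat > "$rcfile" <<'EOF'
-info
-log_view
EOF
cat "$rcfile"
```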
Satish
On Fri, 12 Jun 2020, Y. Shidi wrote:
> Dear developers,
>
> I want to use "-info" to view
Dear developers,
I want to use "-info" to view the preallocation of the matrix.
I cannot add it directly on the command line when I run the code.
I have tried:
PetscOptionsInsertString(), PetscOptionsSetValue().
Neither of them works.
So I am wondering if there are any other functions.
Thank you for your time.
Hi Jed,
Attached is the example for your test.
This example uses H5Sselect_none to tell the H5Dwrite call that there will be no data.
The 4th process HAS to participate since we are in collective mode.
The code is ported and modified based on the C example from
You need DMGlobalToLocal. VecGhost is a different thing.
On Jun 12, 2020 13:17, Shashwat Tiwari wrote:
Hi, I am writing a first order 2D solver for unstructured grids with periodic boundaries using DMPlex. After generating the mesh, I use "DMSetPeriodicity" function to set periodicity in both
Hi,
I am writing a first order 2D solver for unstructured grids with periodic
boundaries using DMPlex. After generating the mesh, I use
"DMSetPeriodicity" function to set periodicity in both directions. After
which I partition the mesh (DMPlexDistribute), construct ghost cells
Thanks Matthew and Mark!
Qin
On Friday, June 12, 2020, 12:22:08 PM CDT, Mark Adams wrote:
On Fri, Jun 12, 2020 at 12:56 PM Qin Lu via petsc-users wrote:
Hello,
I plan to solve a small sparse linear equation system using the direct solver,
since the number of unknowns is small
On Fri, Jun 12, 2020 at 12:56 PM Qin Lu via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Hello,
>
> I plan to solve a small sparse linear equation system using the direct
> solver, since the number of unknowns is small (less than 1000). Here I got
> a few questions:
>
> 1. Is there a general
Hello,
I plan to solve a small sparse linear equation system using the direct solver,
since the number of unknowns is small (less than 1000). Here I got a few
questions:
1. Is there a general guideline on the size of system for which a direct solver
is more efficient than an iterative solver?
2. Is
Danyang Su writes:
> Hi Jed,
>
> Thanks for your double check.
>
> The HDF 1.10.6 version also works. But versions from 1.12.x stop working.
I'd suggest making a reduced test case in order to submit a bug report.
This was the relevant change in PETSc for hdf5-1.12.
Hi Jed,
Thanks for your double check.
The HDF 1.10.6 version also works. But versions from 1.12.x stop working.
Attached is the code section where I have problem.
!c write the dataset collectively