Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-06-25 Thread Hong
Marius: > Thanks a lot. I did not test it thoroughly, but it seems to work well and > is really helpful. I have one question: is it necessary to do the MatTranspose > step as in ex214.c? I thought this was handled internally by PETSc? MUMPS requires sparse compressed COLUMN format in the host for
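A minimal sketch of that calling pattern, assuming F is an existing MUMPS factor, B holds the sparse right-hand sides in AIJ (CSR) format, and X is a preallocated dense solution matrix; the helper name is made up, and ex214.c itself remains the authoritative reference. Because MUMPS wants the right-hand sides by columns (CSC) on the host while AIJ stores by rows (CSR), the explicit transpose is what provides the column-oriented storage:

#include <petscmat.h>

/* F: MUMPS factor of A; B: sparse right-hand sides (AIJ, i.e. CSR);
   X: preallocated dense solution matrix */
PetscErrorCode SolveWithSparseRHS(Mat F, Mat B, Mat X)
{
  Mat            Bt, spRHS;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatTranspose(B, MAT_INITIAL_MATRIX, &Bt);CHKERRQ(ierr);  /* CSR storage of B^T == CSC storage of B */
  ierr = MatCreateTranspose(Bt, &spRHS);CHKERRQ(ierr);            /* virtual transpose: spRHS still represents B */
  ierr = MatMatSolve(F, spRHS, X);CHKERRQ(ierr);
  ierr = MatDestroy(&spRHS);CHKERRQ(ierr);
  ierr = MatDestroy(&Bt);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}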

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-06-24 Thread Marius Buerkle
Thanks a lot. I did not test it thoroughly, but it seems to work well and is really helpful. I have one question: is it necessary to do the MatTranspose step as in ex214.c? I thought this was handled internally by PETSc? Marius: I added support for parallel sparse RHS (in host) for

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-06-20 Thread Hong
Marius: I added support for parallel sparse RHS (in host) for MatMatSolve_MUMPS(): https://bitbucket.org/petsc/petsc/commits/2b691707dd0cf456c808def006e14b6f56b364b6 It is in the branch hzhang/mumps-spRHS. You may test it. I'll further clean up the routine, test it, then merge it into PETSc. Hong

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-06-04 Thread Hong
On Mon, Jun 4, 2018 at 1:03 PM, Jean-Yves L'Excellent <jean-yves.l.excell...@ens-lyon.fr> wrote: > > Thanks for the details of your needs. > > For the first application, the sparse RHS feature with distributed > solution should effectively be fine. > I'll add parallel support of this feature in

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-06-04 Thread Jean-Yves L'Excellent
Thanks for the details of your needs. For the first application, the sparse RHS feature with distributed solution should effectively be fine. For the second one, a future distributed RHS feature (not currently available in MUMPS) might help if the centralized sparse RHS is too memory

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-06-01 Thread Marius Buerkle
And the second application I have in mind is solving a system of the form AX=B, where A and B are sparse and B is given by a block matrix of the form B = [B1 0; 0 0], where B1 is dense but its dimension is (much) smaller than that of the whole matrix B. Marius: The current PETSc interface supports
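For illustration of that right-hand-side structure (dimensions, values, and the helper name are made up): such a B can be assembled as an ordinary AIJ matrix in which only the leading k x k block B1 carries entries, so it stays very sparse overall:

#include <petscmat.h>

/* Build B = [B1 0; 0 0] as an n x n SEQAIJ matrix with a dense k x k leading block */
PetscErrorCode BuildBlockSparseRHS(PetscInt n, PetscInt k, Mat *B)
{
  PetscErrorCode ierr;
  PetscInt       i, j;
  PetscScalar    v;

  PetscFunctionBeginUser;
  ierr = MatCreateSeqAIJ(PETSC_COMM_SELF, n, n, k, NULL, B);CHKERRQ(ierr); /* at most k nonzeros per row */
  for (i = 0; i < k; i++) {
    for (j = 0; j < k; j++) {
      v = 1.0 / (PetscReal)(i + j + 1);   /* placeholder entries for the dense block B1 */
      ierr = MatSetValue(*B, i, j, v, INSERT_VALUES);CHKERRQ(ierr);
    }
  }
  ierr = MatAssemblyBegin(*B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(*B, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}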

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-06-01 Thread Marius Buerkle
I want to invert a rather large sparse matrix; for this, using a sparse RHS with centralized input would be OK as long as the solution is distributed. Marius: The current PETSc interface supports sequential sparse multiple right-hand sides, but not distributed ones. It turns out that MUMPS does

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-06-01 Thread Hong
Marius: The current PETSc interface supports sequential sparse multiple right-hand sides, but not distributed ones. It turns out that MUMPS does not support distributed sparse multiple right-hand sides at the moment (see the attached email). Jean-Yves invites you to communicate with him directly. Let me know

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-31 Thread Marius Buerkle
Thanks a lot, guys, very helpful. I see MUMPS (http://mumps.enseeiht.fr/): Sparse multiple right-hand side, distributed solution; exploitation of sparsity in the right-hand sides. The PETSc interface computes the MUMPS distributed solution as default (this is not new) (ICNTL(21) = 1). I will add

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-31 Thread Hong
I see MUMPS (http://mumps.enseeiht.fr/): Sparse multiple right-hand side, distributed solution; exploitation of sparsity in the right-hand sides. The PETSc interface computes the MUMPS distributed solution as default (this is not new) (ICNTL(21) = 1). I will add support for Sparse multiple
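For readers looking for the corresponding knob, a sketch of how ICNTL(21) is reachable from PETSc. As stated above, the distributed solution is already the default, so the explicit call below (equivalently, the command-line option -mat_mumps_icntl_21 1) is purely illustrative, and the helper name is made up:

#include <petscmat.h>

PetscErrorCode GetMumpsFactorDistributedSolution(Mat A, Mat *F)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatGetFactor(A, MATSOLVERMUMPS, MAT_FACTOR_LU, F);CHKERRQ(ierr);
  ierr = MatMumpsSetIcntl(*F, 21, 1);CHKERRQ(ierr); /* ICNTL(21)=1: leave the solution distributed over the processes */
  /* ...followed by MatLUFactorSymbolic/MatLUFactorNumeric and MatSolve/MatMatSolve as usual... */
  PetscFunctionReturn(0);
}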

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-31 Thread Smith, Barry F.
Hong, can you see about adding support for a distributed right-hand side? Thanks, Barry > On May 31, 2018, at 2:37 AM, Marius Buerkle wrote: > > Thanks again for the fix for MAT_NEW_NONZERO_LOCATIONS. > > I have yet another question, sorry. The recent version of MUMPS supports

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-31 Thread Stefano Zampini
The current version of the MUMPS code in PETSc supports sparse right-hand sides for sequential solvers. You can call MatMatSolve(A,B,X) with the right-hand side B of type MATTRANSPOSE, with the inner matrix being a MATSEQAIJ. 2018-05-31 10:37 GMT+03:00 Marius Buerkle: > Thanks again for the fix for MAT_NEW_NONZERO_LOCATIONS.
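A minimal sketch of that sequential path, assuming A is a SEQAIJ matrix, Bt already contains the transpose of the sparse right-hand-side matrix as a SEQAIJ matrix, and X is a preallocated dense matrix; the function name and the natural ordering are illustrative choices rather than anything prescribed by PETSc:

#include <petscmat.h>

PetscErrorCode SolveSparseRHSSeq(Mat A, Mat Bt, Mat X)
{
  Mat            F, spRHS;
  IS             rowperm, colperm;
  MatFactorInfo  info;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  /* LU factorization of A through the MUMPS interface */
  ierr = MatGetFactor(A, MATSOLVERMUMPS, MAT_FACTOR_LU, &F);CHKERRQ(ierr);
  ierr = MatGetOrdering(A, MATORDERINGNATURAL, &rowperm, &colperm);CHKERRQ(ierr);
  ierr = MatFactorInfoInitialize(&info);CHKERRQ(ierr);
  ierr = MatLUFactorSymbolic(F, A, rowperm, colperm, &info);CHKERRQ(ierr);
  ierr = MatLUFactorNumeric(F, A, &info);CHKERRQ(ierr);

  /* Bt is B^T in SEQAIJ format; the MATTRANSPOSE wrapper presents it as B again
     without forming B explicitly, which is the form the MUMPS interface accepts */
  ierr = MatCreateTranspose(Bt, &spRHS);CHKERRQ(ierr);
  ierr = MatMatSolve(F, spRHS, X);CHKERRQ(ierr);  /* X: dense matrix holding the solutions */

  ierr = MatDestroy(&spRHS);CHKERRQ(ierr);
  ierr = MatDestroy(&F);CHKERRQ(ierr);
  ierr = ISDestroy(&rowperm);CHKERRQ(ierr);
  ierr = ISDestroy(&colperm);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}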

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-31 Thread Marius Buerkle
Thanks again for the fix for MAT_NEW_NONZERO_LOCATIONS. I have yet another question, sorry. The recent version of MUMPS supports distributed and sparse RHS; is there any chance that this will be supported in PETSc in the near future? > On May 30, 2018, at 6:55 PM, Marius Buerkle wrote: > >

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-30 Thread Smith, Barry F.
> On May 30, 2018, at 6:55 PM, Marius Buerkle wrote: > > Thanks for the quick fix, I will test it and report back. > I have another, maybe related, question: if MAT_NEW_NONZERO_LOCATIONS is true > and, let's say, 1 new nonzero position is created, it does not allocate 1 but > several new

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-30 Thread Marius Buerkle
Thanks for the quick fix, I will test it and report back. I have another, maybe related, question: if MAT_NEW_NONZERO_LOCATIONS is true and, let's say, 1 new nonzero position is created, it does not allocate 1 but several new nonzeros, yet only uses 1. I think that is normal, right? But, at least as

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-30 Thread Smith, Barry F.
Fixed in the branch barry/fix-mat-new-nonzero-locations/maint. Once this passes testing it will go into the maint branch and then the next patch release, but you can use it now in the branch barry/fix-mat-new-nonzero-locations/maint. Thanks for the report and reproducible example

Re: [petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-29 Thread Smith, Barry F.
Please send the complete error message, the type of matrix used, etc.; ideally, code that demonstrates the problem. Barry > On May 29, 2018, at 3:31 AM, Marius Buerkle wrote: > > > Hi, > > I tried to set MAT_NEW_NONZERO_LOCATIONS to false; as far as I understood, > MatSetValues should simply

[petsc-users] MAT_NEW_NONZERO_LOCATIONS working?

2018-05-29 Thread Marius Buerkle
Hi, I tried to set MAT_NEW_NONZERO_LOCATIONS to false; as far as I understood, MatSetValues should simply ignore entries which would give rise to new nonzero values, not creating a new entry and not causing an error, but I get "[1]PETSC ERROR: Inserting a new nonzero at global row/column". Is
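To make the question concrete, a small sketch of the expected behaviour (matrix, indices, and the function name are made up): with MAT_NEW_NONZERO_LOCATIONS set to PETSC_FALSE, and the separate option MAT_NEW_NONZERO_LOCATION_ERR left off, an insertion outside the preallocated nonzero pattern should be dropped silently instead of producing the error quoted above:

#include <petscmat.h>

PetscErrorCode InsertOutsidePattern(Mat A)
{
  PetscErrorCode ierr;
  PetscInt       row = 0, col = 5;   /* assume (0,5) is NOT in the preallocated pattern */
  PetscScalar    v   = 1.0;

  PetscFunctionBeginUser;
  ierr = MatSetOption(A, MAT_NEW_NONZERO_LOCATIONS, PETSC_FALSE);CHKERRQ(ierr);    /* ignore new locations */
  ierr = MatSetOption(A, MAT_NEW_NONZERO_LOCATION_ERR, PETSC_FALSE);CHKERRQ(ierr); /* do not error on them */
  ierr = MatSetValues(A, 1, &row, 1, &col, &v, INSERT_VALUES);CHKERRQ(ierr);       /* expected to be dropped silently */
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}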