Re: [petsc-dev] MPI_UB is deprecated in MPI-2.0

2019-03-21 Thread Zhang, Junchao via petsc-dev
I pushed an update to this branch, which adopts MPI_Type_create_resized. --Junchao Zhang On Tue, Mar 19, 2019 at 11:56 AM Balay, Satish via petsc-dev <petsc-dev@mcs.anl.gov> wrote: For now - I'm merging this branch to next. If a better fix comes up later we can merge it then. thanks, ...
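[Editorial note: a minimal sketch, not the actual change in the branch, of how a struct datatype's extent can be set with MPI_Type_create_resized instead of the deprecated MPI_UB marker. The struct layout and helper name below are hypothetical.]

/* Sketch: replace the deprecated MPI_UB upper-bound marker by explicitly
   resizing the datatype's extent with MPI_Type_create_resized. */
#include <mpi.h>
#include <stddef.h>

typedef struct { double v; int i; } Entry;   /* hypothetical layout */

static MPI_Datatype build_entry_type(void)
{
  MPI_Datatype tmp, entry;
  int          blocklens[2] = {1, 1};
  MPI_Aint     displs[2]    = {offsetof(Entry, v), offsetof(Entry, i)};
  MPI_Datatype types[2]     = {MPI_DOUBLE, MPI_INT};

  MPI_Type_create_struct(2, blocklens, displs, types, &tmp);
  /* Instead of appending MPI_UB, set the extent to sizeof(Entry) directly */
  MPI_Type_create_resized(tmp, 0, (MPI_Aint)sizeof(Entry), &entry);
  MPI_Type_free(&tmp);
  MPI_Type_commit(&entry);
  return entry;
}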

Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-21 Thread Mark Adams via petsc-dev
> > > Could you explain this more by adding some small examples? > > Since you are considering implementing all-at-once (four nested loops, right?) I'll give you my old code. This code is hardwired for two AMG and for a geometric-AMG, where the blocks of the R (and hence P) matrices are scaled ...
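[Editorial note: a tiny dense-matrix illustration of what an "all-at-once" triple product C = P^T A P written as four nested loops looks like. This is only a sketch under that assumption, not Mark's code, which works on sparse, block-scaled R/P matrices.]

/* Illustrative only: dense all-at-once PtAP, C = P^T * A * P, as four nested
   loops. A is n x n, P is n x m, C is m x m, all row-major. Real sparse
   kernels avoid the O(m^2 n^2) work shown here. */
static void ptap_dense(int n, int m, const double *A, const double *P, double *C)
{
  for (int i = 0; i < m; i++) {
    for (int j = 0; j < m; j++) {
      double s = 0.0;
      for (int k = 0; k < n; k++) {
        for (int l = 0; l < n; l++) s += P[k*m + i] * A[k*n + l] * P[l*m + j];
      }
      C[i*m + j] = s;
    }
  }
}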

Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-21 Thread Fande Kong via petsc-dev
Hi Mark, Thanks for your email. On Thu, Mar 21, 2019 at 6:39 AM Mark Adams via petsc-dev <petsc-dev@mcs.anl.gov> wrote: > I'm probably screwing up some sort of history by jumping into dev, but this is a dev comment ... > (1) -matptap_via hypre: This calls the hypre package to do the PtAP ...
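[Editorial note: a hedged usage sketch, not from the thread, of selecting the PtAP algorithm at runtime and forming the coarse operator through PETSc's MatPtAP interface. The function name is made up, error handling is minimal, and the matrices A and P are assumed to be assembled already.]

/* Sketch: pick the PtAP implementation via the -matptap_via option and then
   form C = P^T * A * P. */
#include <petscmat.h>

PetscErrorCode form_coarse_operator(Mat A, Mat P, Mat *C)
{
  PetscErrorCode ierr;
  /* Equivalent to passing -matptap_via hypre on the command line */
  ierr = PetscOptionsSetValue(NULL, "-matptap_via", "hypre");CHKERRQ(ierr);
  ierr = MatPtAP(A, P, MAT_INITIAL_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);
  return 0;
}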

Re: [petsc-dev] [petsc-maint] PETSc release by March 29, 2019

2019-03-21 Thread Balay, Satish via petsc-dev
On Tue, 5 Mar 2019, Balay, Satish via petsc-maint wrote: > perhaps starting March 18 - freeze access to next - and keep > recreating next & next-tmp dynamically as needed A note: I've restricted access to 'next' so that the above workflow can be used for the release [if needed]. Satish

Re: [petsc-dev] [petsc-maint] PETSc release by March 29, 2019

2019-03-21 Thread Balay, Satish via petsc-dev
A reminder! Also please check and update src/docs/website/documentation/changes/dev.html thanks, Satish On Tue, 5 Mar 2019, Balay, Satish via petsc-maint wrote: > Sure - I would add caveats such as: > > - it's best to submit PRs early - if they are critical [i.e. if the > branch should be in ...

Re: [petsc-dev] [petsc-users] Bad memory scaling with PETSc 3.10

2019-03-21 Thread Mark Adams via petsc-dev
I'm probably screwing up some sort of history by jumping into dev, but this is a dev comment ... > (1) -matptap_via hypre: This calls the hypre package to do the PtAP through an all-at-once triple product. In our experiences, it is the most memory efficient, but could be slow. FYI, I visited ...
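[Editorial note: for contrast, a hedged sketch of the two-step alternative that an all-at-once product avoids: explicitly forming the intermediate AP = A*P and then P^T*(AP), which is where the extra memory goes. Function and variable names are illustrative.]

/* Sketch of the two-step product C = P^T * (A * P). The intermediate matrix AP
   is what an all-at-once algorithm never stores, which is why the latter can
   be much more memory efficient. */
#include <petscmat.h>

PetscErrorCode ptap_two_step(Mat A, Mat P, Mat *C)
{
  PetscErrorCode ierr;
  Mat            AP;

  ierr = MatMatMult(A, P, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &AP);CHKERRQ(ierr);
  ierr = MatTransposeMatMult(P, AP, MAT_INITIAL_MATRIX, PETSC_DEFAULT, C);CHKERRQ(ierr);
  ierr = MatDestroy(&AP);CHKERRQ(ierr);
  return 0;
}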