Re: [petsc-users] create global vector in latest version of petsc

2016-10-05 Thread Hengjie Wang
Hi, there is no error now. Thank you so much. Frank On 10/5/2016 8:50 PM, Barry Smith wrote: Sorry, as indicated in http://www.mcs.anl.gov/petsc/documentation/changes/dev.html, in order to get the previous behavior of DMDACreate3d() you need to follow it with the two lines

Re: [petsc-users] create global vector in latest version of petsc

2016-10-05 Thread Barry Smith
Sorry, as indicated in http://www.mcs.anl.gov/petsc/documentation/changes/dev.html, in order to get the previous behavior of DMDACreate3d() you need to follow it with the two lines DMSetFromOptions(da); DMSetUp(da). Barry > On Oct 5, 2016, at 9:11 PM, Hengjie Wang
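A minimal sketch of the creation sequence Barry describes, for readers hitting the same change; the grid sizes, stencil, and omitted error checking below are illustrative assumptions, not taken from Frank's program:

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM  da;
      Vec x;

      PetscInitialize(&argc, &argv, NULL, NULL);
      /* In the development version, DMDACreate3d() no longer sets up the DM */
      DMDACreate3d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                   DM_BOUNDARY_NONE, DMDA_STENCIL_STAR, 8, 8, 8,
                   PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE, 1, 1,
                   NULL, NULL, NULL, &da);
      DMSetFromOptions(da);          /* the two lines that restore the  */
      DMSetUp(da);                   /* previous behavior               */
      DMCreateGlobalVector(da, &x);  /* now succeeds                    */
      VecDestroy(&x);
      DMDestroy(&da);
      PetscFinalize();
      return 0;
    }

Without the DMSetFromOptions()/DMSetUp() calls, the subsequent DMCreateGlobalVector() fails in the development version, which matches the error Frank reports.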

Re: [petsc-users] issue with NullSpaceRemove in parallel

2016-10-05 Thread Barry Smith
The message "Scalar value must be same on all processes, argument # 2" comes up often when a Nan or Inf as gotten into the computation. The IEEE standard for floating point operations defines that Nan != Nan; I recommend running again with -fp_trap this should cause the code to stop

[petsc-users] issue with NullSpaceRemove in parallel

2016-10-05 Thread Mohammad Mirzadeh
Hi folks, I am trying to track down a bug that is sometimes triggered when solving a singular system (Poisson + Neumann). It only seems to happen in parallel and halfway through the run. I can provide detailed information about the actual problem, but the error message I get boils down to this:

Re: [petsc-users] create global vector in latest version of petsc

2016-10-05 Thread Hengjie Wang
Hi, I just tried .F90. It still had the error. I attached the full error log. Thank you. Frank On 10/5/2016 6:57 PM, Barry Smith wrote: PETSc Fortran programs should always end with .F90, not .f90; can you try again with that name? The capital F is important. Barry On Oct 5, 2016, at 7:57

Re: [petsc-users] create global vector in latest version of petsc

2016-10-05 Thread Barry Smith
PETSc Fortran programs should always end with .F90, not .f90. Can you try again with that name? The capital F is important. Barry > On Oct 5, 2016, at 7:57 PM, frank wrote: > > Hi, > > I updated petsc to the latest version by pulling from the repo. Then I found > that one of

Re: [petsc-users] create global vector in latest version of petsc

2016-10-05 Thread Hengjie Wang
Hi, I did. I am using GNU compiler 5.4.0. I don't know if this matters. Thank you. Frank On 10/5/2016 6:08 PM, Matthew Knepley wrote: On Wed, Oct 5, 2016 at 7:57 PM, frank wrote: Hi, I updated petsc to the latest version by pulling from

Re: [petsc-users] create global vector in latest version of petsc

2016-10-05 Thread Matthew Knepley
On Wed, Oct 5, 2016 at 7:57 PM, frank wrote: > Hi, > > I updated petsc to the latest version by pulling from the repo. Then I found > that one of my old codes, which worked before, outputs errors now. > After debugging, I found that the error is caused by "DMCreateGlobalVector". > I

[petsc-users] create global vector in latest version of petsc

2016-10-05 Thread frank
Hi, I updated petsc to the latest version by pulling from the repo. Then I found that one of my old codes, which worked before, outputs errors now. After debugging, I found that the error is caused by "DMCreateGlobalVector". I attach a short program which can reproduce the error. This program works

Re: [petsc-users] large PetscCommDuplicate overhead

2016-10-05 Thread Barry Smith
> On Oct 5, 2016, at 2:30 PM, Matthew Overholt wrote: > > Hi Petsc-Users, > > I am trying to understand an issue where PetscCommDuplicate() calls are > taking an increasing percentage of time as I run a fixed-size problem on > more processes. > > I am using the FEM

Re: [petsc-users] large PetscCommDuplicate overhead

2016-10-05 Thread Matthew Knepley
On Wed, Oct 5, 2016 at 2:30 PM, Matthew Overholt wrote: > Hi Petsc-Users, > > I am trying to understand an issue where PetscCommDuplicate() calls are > taking an increasing percentage of time as I run a fixed-size problem on > more processes. > > I am using the FEM

[petsc-users] large PetscCommDuplicate overhead

2016-10-05 Thread Matthew Overholt
Hi Petsc-Users, I am trying to understand an issue where PetscCommDuplicate() calls are taking an increasing percentage of time as I run a fixed-size problem on more processes. I am using the FEM to solve the steady-state heat transfer equation (K.x = q) using a PC direct solver, like
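One way to narrow down where this time goes, sketched here with an assumed stage name and structure rather than anything from Overholt's code, is to wrap the object-creation phase in a PETSc logging stage and run with -log_view:

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      PetscLogStage setup;

      PetscInitialize(&argc, &argv, NULL, NULL);
      PetscLogStageRegister("SolverSetup", &setup);
      PetscLogStagePush(setup);
      /* ... create the matrix, KSP, and PC here; PetscCommDuplicate()
         is called internally whenever a PETSc object is created on a
         communicator PETSc has not seen yet ... */
      PetscLogStagePop();
      PetscFinalize();  /* run with -log_view for per-stage timings */
      return 0;
    }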

Re: [petsc-users] using DMDA with python

2016-10-05 Thread Dave May
On 5 October 2016 at 18:49, Matthew Knepley wrote: > On Wed, Oct 5, 2016 at 11:19 AM, E. Tadeu wrote: >> Matt, >> Do you know if there is any example of solving Navier-Stokes using a >> staggered approach by using a different DM object such as

Re: [petsc-users] using DMDA with python

2016-10-05 Thread Matthew Knepley
On Wed, Oct 5, 2016 at 11:19 AM, E. Tadeu wrote: > Matt, > > Do you know if there is any example of solving Navier-Stokes using a > staggered approach by using a different DM object such as DMPlex? > SNES ex62 can do P2/P1 Stokes, which is similar. Is that what you want to

Re: [petsc-users] Vector with ghost values using DMDA

2016-10-05 Thread Praveen C
Thanks to all. Your answers were very helpful. Best praveen

Re: [petsc-users] using DMDA with python

2016-10-05 Thread E. Tadeu
Matt, Do you know if there is any example of solving Navier-Stokes using a staggered approach by using a different DM object such as DMPlex? Thanks, Edson On Tue, Oct 4, 2016 at 11:12 PM, Matthew Knepley wrote: > On Tue, Oct 4, 2016 at 9:02 PM, Somdeb Bandopadhyay

Re: [petsc-users] using DMDA with python

2016-10-05 Thread Matthew Knepley
On Tue, Oct 4, 2016 at 9:47 PM, Somdeb Bandopadhyay wrote: > Hi again, > Please allow me to explain in detail here: > 1. I am using Zang's (JCP 1994) method for incompressible flow on a > generalized collocated grid. > 2. The main difference lies in the

Re: [petsc-users] Vector with ghost values using DMDA

2016-10-05 Thread Matthew Knepley
On Wed, Oct 5, 2016 at 7:54 AM, Praveen C wrote: > Dear all > > I am using DMDA and create a vector with > > DMCreateGlobalVector > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMCreateLocalVector.html Matt > However this does not have ghost values.

Re: [petsc-users] Vector with ghost values using DMDA

2016-10-05 Thread Jed Brown
Praveen C writes: > So I have to create a global vector AND a local vector using > DMCreateLocalVector. > > Then I do DMGlobalToLocalBegin/End. Does this not lead to too much > copying? It's typically more efficient -- the solver gets to work with contiguous vectors and

Re: [petsc-users] How to broadcast a double value to all the nodes in the cluster with Petsc

2016-10-05 Thread Hong
丁老师: > How to broadcast a double value to all the nodes in the cluster with > Petsc? MPI_Bcast(). Hong

Re: [petsc-users] Vector with ghost values using DMDA

2016-10-05 Thread Hong
Praveen: DMGetLocalVector(). See petsc/src/snes/examples/tutorials/ex19.c. Hong > So I have to create a global vector AND a local vector using > DMCreateLocalVector. > > Then I do DMGlobalToLocalBegin/End. Does this not lead to too much copying? > I see there is VecCreateGhost but no such thing
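Putting the thread's advice together, here is a minimal sketch assuming a small 2D DMDA with one degree of freedom; the grid sizes and omitted error checking are illustrative:

    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM  da;
      Vec g, l;

      PetscInitialize(&argc, &argv, NULL, NULL);
      DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                   DMDA_STENCIL_STAR, 16, 16, PETSC_DECIDE, PETSC_DECIDE,
                   1, 1, NULL, NULL, &da);
      DMSetFromOptions(da);
      DMSetUp(da);
      DMCreateGlobalVector(da, &g);  /* owned entries only; what solvers see  */
      DMGetLocalVector(da, &l);      /* borrowed work vector with ghost space */
      DMGlobalToLocalBegin(da, g, INSERT_VALUES, l);
      DMGlobalToLocalEnd(da, g, INSERT_VALUES, l);
      /* ... read ghost values from l, e.g. via DMDAVecGetArray() ... */
      DMRestoreLocalVector(da, &l);
      VecDestroy(&g);
      DMDestroy(&da);
      PetscFinalize();
      return 0;
    }

DMGetLocalVector()/DMRestoreLocalVector() borrow a vector from a pool the DM keeps, so no fresh allocation is needed each time the ghost exchange runs.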

Re: [petsc-users] Vector with ghost values using DMDA

2016-10-05 Thread Praveen C
So I have to create a global vector AND a local vector using DMCreateLocalVector. Then I do DMGlobalToLocalBegin/End. Does this not lead to too much copying? I see there is VecCreateGhost but no such thing for DMDA? Best praveen PS: It would be nice if the reply-to was set to the mailing list. I

Re: [petsc-users] Vector with ghost values using DMDA

2016-10-05 Thread Jed Brown
Praveen C writes: > Dear all > > I am using DMDA and create a vector with > > DMCreateGlobalVector > > However this does not have ghost values. How should I create the vector if I > want to access ghost values? That's what local vectors are for.

[petsc-users] Vector with ghost values using DMDA

2016-10-05 Thread Praveen C
Dear all, I am using DMDA and create a vector with DMCreateGlobalVector. However, this does not have ghost values. How should I create the vector if I want to access ghost values? Thanks praveen

Re: [petsc-users] How to broadcast a double value to all the nodes in the cluster with Petsc

2016-10-05 Thread Patrick Sanan
PETSc, by design, does not wrap any of the existing functionality of MPI, so this would be accomplished with an MPI function like MPI_Bcast(). On Wed, Oct 5, 2016 at 11:02 AM, 丁老师 wrote: > Dear professor: > How to broadcast a double value to all the nodes in the cluster
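A minimal sketch of what Patrick suggests, broadcasting a double from rank 0 to every rank of PETSC_COMM_WORLD; the example value is of course illustrative:

    #include <petscsys.h>

    int main(int argc, char **argv)
    {
      double      value = 0.0;
      PetscMPIInt rank;

      PetscInitialize(&argc, &argv, NULL, NULL);
      MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
      if (rank == 0) value = 3.14;  /* value initially known only on rank 0 */
      MPI_Bcast(&value, 1, MPI_DOUBLE, 0, PETSC_COMM_WORLD);
      PetscPrintf(PETSC_COMM_SELF, "[%d] value = %g\n", rank, value);
      PetscFinalize();
      return 0;
    }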

[petsc-users] How to broadcast a double value to all the nodes in the cluster with Petsc

2016-10-05 Thread 丁老师
Dear professor: How do I broadcast a double value to all the nodes in the cluster with Petsc?