> On Dec 1, 2014, at 3:40 PM, paul zhang <[email protected]> wrote:
> 
> And the MPI and PETSc test with a segmentation fault. 

   What do you mean by this? Previously you sent

Using PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 and 
PETSC_ARCH=linux-gnu-intel
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI 
process
C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI 
processes

indicating the PETSc test ran ok in parallel.
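
A frequent cause of a segfault at the first MPI call in a program that includes petscksp.h is calling MPI functions before PetscInitialize() (which calls MPI_Init() if it has not been called yet), or compiling against a different MPI implementation than the one PETSc was built with. A minimal sketch of the expected structure, assuming PETSc 3.5 (the printed text is illustrative, not from the thread's attachments):

```c
#include <petscksp.h>   /* pulls in mpi.h from the MPI that PETSc was built with */

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscMPIInt    rank;

  /* PetscInitialize() must come before any MPI call; it calls
     MPI_Init() itself if MPI has not been initialized yet. */
  ierr = PetscInitialize(&argc, &argv, NULL, NULL);
  if (ierr) return ierr;

  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_SELF, "Hello from rank %d\n", rank);CHKERRQ(ierr);

  ierr = PetscFinalize();
  return ierr;
}
```

Building with the same mpicc/mpicxx wrapper used to build PETSc (or via PETSc's makefile variables) avoids mixing MPI libraries; a CMake build can silently pick up a different MPI installation than the one PETSc was configured with.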

   Barry

> 
> This is the final goal. Many thanks to you Jed. 
> Paul
> 
> Huaibao (Paul) Zhang
> Gas Surface Interactions Lab
> Department of Mechanical Engineering
> University of Kentucky,
> Lexington,
> KY, 40506-0503
> Office: 216 Ralph G. Anderson Building
> Web: gsil.engineering.uky.edu
> 
> On Mon, Dec 1, 2014 at 4:39 PM, paul zhang <[email protected]> wrote:
> I'd better send you the original files. The compressed files triggered some 
> warnings, I guess. 
> Attached is the MPI test that has been verified. 
> 
> 
> On Mon, Dec 1, 2014 at 4:33 PM, paul zhang <[email protected]> wrote:
> Hi Jed,
> 
> Now I see PETSc is compiled correctly. However, when I attempted to include 
> "petscksp.h" in my own program (quite a simple one), it failed for some reason. 
> Attached you can see two cases. The first is just a test of MPI, which is 
> fine. The second adds PETSc, and it hits a segmentation fault when it reaches 
> 
>         MPI_Comm_rank(MPI_COMM_WORLD, &rank);        /* get current process id */
> 
> Can you shed some light? The MPI version is 1.8.3. 
> 
> Thanks,
> Paul
> 
> 
> On Mon, Dec 1, 2014 at 4:20 PM, paul zhang <[email protected]> wrote:
> 
> Sorry, I should have replied to the lists.
> 
> [hzh225@dlxlogin2-2 petsc-3.5.2]$ make 
> PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 PETSC_ARCH=linux-gnu-intel test
> 
> Running test examples to verify correct installation
> Using PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 and 
> PETSC_ARCH=linux-gnu-intel
> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI 
> process
> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI 
> processes
> Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI 
> process
> Completed test examples
> =========================================
> Now to evaluate the computer systems you plan use - do:
> make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 PETSC_ARCH=linux-gnu-intel 
> streams NPMAX=<number of MPI processes you intend to use>
> 
> 
> 
> On Mon, Dec 1, 2014 at 4:18 PM, Jed Brown <[email protected]> wrote:
> paul zhang <[email protected]> writes:
> 
> > Hi Jed,
> > Does this mean I've passed the default test?
> 
> It's an MPI test.  Run this to see if PETSc solvers are running correctly:
> 
>   make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 
> PETSC_ARCH=linux-gnu-intel test
> 
> > Is the "open matplotlib" message an issue?
> 
> No, it's just a Python library that would be used to create a nice
> figure if you had it installed.
> 
> 
> <CMakeLists.txt><main.cc>
