And attached is the MPI + PETSc test that hits the segmentation fault.
Getting this to work is the final goal. Many thanks to you, Jed.
Paul
Huaibao (Paul) Zhang
*Gas Surface Interactions Lab*
Department of Mechanical Engineering
University of Kentucky,
Lexington,
KY, 40506-0503
*Office*: 216 Ralph G. Anderson Building
*Web*: gsil.engineering.uky.edu
On Mon, Dec 1, 2014 at 4:39 PM, paul zhang <[email protected]> wrote:
> I had better send you the original files; the compressed ones triggered
> some warnings, I guess.
> Attached is the MPI test that has been verified.
>
>
> On Mon, Dec 1, 2014 at 4:33 PM, paul zhang <[email protected]>
> wrote:
>
>> Hi Jed,
>>
>> Now I see PETSc is compiled correctly. However, when I attempted to
>> include "petscksp.h" in my own program (quite a simple one), it failed for
>> some reason. Attached are two cases. The first is just the MPI test,
>> which is fine. The second adds PETSc, and it hits a segmentation fault
>> when it reaches
>>
>> MPI_Comm_rank (MPI_COMM_WORLD, &rank); /* get current
>> process id */
>>
>> Can you shed some light? The Open MPI version is 1.8.3.
>>
>> Thanks,
>> Paul
>>
>>
>> On Mon, Dec 1, 2014 at 4:20 PM, paul zhang <[email protected]>
>> wrote:
>>
>>>
>>> Sorry, I should have replied to the list.
>>>
>>> [hzh225@dlxlogin2-2 petsc-3.5.2]$ make
>>> PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 PETSC_ARCH=linux-gnu-intel
>>> test
>>>
>>> Running test examples to verify correct installation
>>> Using PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2 and
>>> PETSC_ARCH=linux-gnu-intel
>>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1
>>> MPI process
>>> C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2
>>> MPI processes
>>> Fortran example src/snes/examples/tutorials/ex5f run successfully with 1
>>> MPI process
>>> Completed test examples
>>> =========================================
>>> Now to evaluate the computer systems you plan use - do:
>>> make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2
>>> PETSC_ARCH=linux-gnu-intel streams NPMAX=<number of MPI processes you
>>> intend to use>
>>>
>>>
>>>
>>> On Mon, Dec 1, 2014 at 4:18 PM, Jed Brown <[email protected]> wrote:
>>>
>>>> paul zhang <[email protected]> writes:
>>>>
>>>> > Hi Jed,
>>>> > Does this mean I've passed the default test?
>>>>
>>>> It's an MPI test. Run this to see if PETSc solvers are running
>>>> correctly:
>>>>
>>>> make PETSC_DIR=/home/hzh225/LIB_CFD/nP/petsc-3.5.2
>>>> PETSC_ARCH=linux-gnu-intel test
>>>>
>>>> > Is the "open matplotlib " an issue?
>>>>
>>>> No, it's just a Python library that would be used to create a nice
>>>> figure if you had it installed.
>>>>
>>>
>>>
>>
>
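A segmentation fault at the very first MPI call, as described above, is often a sign that the program was compiled against one MPI implementation but loads another at run time (for example, if PETSc was built with a different MPI than the mpiCC named in the CMake file). A hedged sketch of checks one could run, using the paths from this thread; the binary name `kats` comes from the attached CMakeLists.txt:

```shell
# Show what the Open MPI compiler wrapper really invokes.
/home/hzh225/LIB_CFD/openmpi-1.8.3/bin/mpiCC --showme

# Check which libmpi the test binary and the PETSc library actually load;
# both should resolve to the same Open MPI installation.
ldd ./kats | grep -i mpi
ldd /home/hzh225/LIB_CFD/nP/petsc-3.5.2/linux-gnu-intel/lib/libpetsc.so | grep -i mpi
```

If the two `ldd` listings point at different MPI libraries, rebuilding either PETSc or the application with the matching wrapper usually resolves the crash.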
cmake_minimum_required(VERSION 2.6)
# The compiler must be set before project() so CMake's compiler checks use it.
set (CMAKE_CXX_COMPILER /home/hzh225/LIB_CFD/openmpi-1.8.3/bin/mpiCC)
set (CMAKE_CXX_FLAGS "-O3")
project(kats)
set (kats_VERSION_MAJOR 2)
set (kats_VERSION_MINOR 0)
set (PETSC_INCLUDE_DIRS1 /home/hzh225/LIB_CFD/nP/petsc-3.5.2/include)
set (PETSC_INCLUDE_DIRS2 /home/hzh225/LIB_CFD/nP/petsc-3.5.2/linux-gnu-intel/include)
set (PETSC_LIBRARY_DIRS /home/hzh225/LIB_CFD/nP/petsc-3.5.2/linux-gnu-intel/lib)
set (VALGRIND_INCLUDE_DIR /share/cluster/RHEL6.2/x86_64/apps/valgrind/3.9.0/include)
set (VALGRIND_LIBRARY_DIR /share/cluster/RHEL6.2/x86_64/apps/valgrind/3.9.0/lib)
list (APPEND CMAKE_MODULE_PATH "${kats_SOURCE_DIR}/CMake")
# Pass some CMake settings to source code through a header file
configure_file (
"${PROJECT_SOURCE_DIR}/cmake_vars.h.in"
"${PROJECT_BINARY_DIR}/cmake_vars.h"
)
set (CMAKE_INSTALL_PREFIX ${PROJECT_SOURCE_DIR}/../)
# add to the include search path
include_directories("${PROJECT_SOURCE_DIR}")
include_directories(${PETSC_INCLUDE_DIRS1})
include_directories(${PETSC_INCLUDE_DIRS2})
include_directories(${VALGRIND_INCLUDE_DIR})
link_directories(${PETSC_LIBRARY_DIRS})
link_directories(${VALGRIND_LIBRARY_DIR})
set (EXTRA_LIBS petsc)
#add the executable
set (SOURCES
main.cc
cmake_vars.h
)
add_executable(kats ${SOURCES})
target_link_libraries (kats ${EXTRA_LIBS})
install (TARGETS kats RUNTIME DESTINATION bin)
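As an aside, hard-coding the PETSc include and library paths works but is brittle. A sketch of an alternative, assuming the PETSc installation provides a PETSc.pc pkg-config file (newer PETSc versions install one under $PETSC_DIR/$PETSC_ARCH/lib/pkgconfig; this fragment is illustrative, not taken from the thread):

```cmake
# Hypothetical alternative: locate PETSc through pkg-config rather than
# hard-coded paths. Assumes PETSc.pc is visible via PKG_CONFIG_PATH.
find_package(PkgConfig REQUIRED)
pkg_check_modules(PETSC REQUIRED PETSc)
include_directories(${PETSC_INCLUDE_DIRS})
link_directories(${PETSC_LIBRARY_DIRS})
target_link_libraries(kats ${PETSC_LIBRARIES})
```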
#include "petscksp.h"
#include <mpi.h>
#include <iostream>
#include <cstdio>
#include <cstdlib>

static char help[] = "Solves a tridiagonal linear system with KSP.\n\n";

using namespace std;

int Rank, np;

/* Custom error printer installed into PetscErrorPrintf in main(). */
int PetscPrintError(const char error[], ...) {
  if (Rank == 0) cerr << "PETSc Error ... exiting" << endl;
  exit(1);
  return 0;
}

int main(int argc, char **argv) {
  int rank, size;

  cout << "before initialization" << endl;
  MPI_Init(&argc, &argv);                /* starts MPI */
  cout << "after initialization" << endl;

  cout << "before rank" << endl;
  MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* get current process id */
  cout << "after rank" << endl;
  MPI_Comm_size(MPI_COMM_WORLD, &size);  /* get number of processes */
  printf("Hello world from process %d of %d\n", rank, size);

  /* PetscInitialize may be called after MPI_Init; PETSc then attaches to
     the already-running MPI instead of starting its own. */
  PetscInitialize(&argc, &argv, (char *)0, help);
  PetscErrorPrintf = PetscPrintError;

  PetscFinalize();                       /* must come before MPI_Finalize */
  MPI_Finalize();
  return 0;
}