Sure, this is the log file for the case with KSPGMRES and PCBJACOBI on 2
GPUs.
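For reference, the solver settings printed in the log (GMRES with restart
30, rtol 1e-5, atol 1e-30, max 50 iterations, block-Jacobi preconditioner)
correspond roughly to the sketch below. This is not the actual COOLFluiD
wrapper code (that lives in plugins/Petsc/StdParSolveSys.cxx); the function
name is illustrative and CHKERRQ error checks are omitted.

#include <petscksp.h>

/* Minimal sketch of the logged configuration: GMRES + block-Jacobi.
   A, b, x are assumed already assembled (with CUSP-backed Mat/Vec
   types when running on the GPU). */
PetscErrorCode solve_sketch(Mat A, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;

  KSPCreate(PETSC_COMM_WORLD, &ksp);
  KSPSetType(ksp, KSPGMRES);
  KSPGMRESSetRestart(ksp, 30);            /* "Petsc Nb KSP spaces = 30"  */
  KSPSetTolerances(ksp, 1e-5, 1e-30,      /* relative and absolute tols  */
                   PETSC_DEFAULT, 50);    /* divergence tol, max iters   */
  KSPGetPC(ksp, &pc);
  PCSetType(pc, PCBJACOBI);               /* the crashing case; PCASM is
                                             the other variant tested    */
  /* COOLFluiD also requests MATORDERING_RCM for the block factorization;
     omitted here for brevity. */
  KSPSetOperators(ksp, A, A,              /* 4-arg form of this petsc-dev
                                             vintage; newer PETSc drops  */
                  DIFFERENT_NONZERO_PATTERN); /* the MatStructure flag   */
  KSPSolve(ksp, b, x);
  KSPDestroy(&ksp);
  return 0;
}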

Andrea


On Sun, Jan 19, 2014 at 10:28 PM, Jed Brown <[email protected]> wrote:

> Andrea Lani <[email protected]> writes:
>
> > Dear Devs,
> >
> > first of all, even after Karl's latest fix, my GPU-enabled solver
> > (using PETSc's KSPGMRES and PCASM) still gives a substantially
> > different residual after one time step on 1 GPU than on multiple GPUs:
> >
> > +0.72 (1 GPU) against -1.00 (2-8 GPUs).
> >
> > The situation is even more dramatic for PCBJACOBI: in this case the
> > solver works the same on 1 GPU but crashes on multiple GPUs ("Null
> > argument, when expecting valid pointer! [0]PETSC ERROR: Trying to zero
> > at a null pointer!").
>
> *always* send the entire error message
>



-- 
Dr. Andrea Lani
Senior Research Engineer, PhD
Aeronautics & Aerospace dept., CFD group
Von Karman Institute for Fluid Dynamics
Chaussée de Waterloo 72,
B-1640, Rhode-Saint-Genèse, Belgium
fax  : +32-2-3599600
work : +32-2-3599769
[email protected]
--- coolfluid-solver ----------------------------------------

starting in directory [/home/lani/CF_2013.9/OPENMPI_FIXED_DEBUG/cuda/src/Solver]

called with arguments:
arg [0] : [./coolfluid-solver]
arg [1] : [--scase]
arg [2] : [/home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/cudaImpl.CFcase]
arg [3] : [-log_summary]

-------------------------------------------------------------
places to search for libraries ...
[
	libpath [/home/lani/CF_2013.9/OPENMPI_FIXED_DEBUG/cuda/dso]
]
-------------------------------------------------------------

-------------------------------------------------------------
COOLFluiD Environment
-------------------------------------------------------------
COOLFluiD version    : 2013.9 Kernel 2.5.0 ( r14864:14878M, MPI, CUDA )
Parallel Environment : MPI
Build system         : CMake 2.8.3
Build OS             : Linux-2.6.32-358.6.1.el6.x86_64 [64bits]
Build processor      : x86_64
-------------------------------------------------------------
Removing previous config logs
removing file: P0-output.log
removing file: config-debug-info.log
removing file: config-p0.log
removing file: config-p1.log
removing file: config.log
Base Dir set to: '/home/lani/CF_2013.9'
##### CudaDeviceManager::printProperties() for device [0] #####
name = Tesla K10.G2.8GB
capability = 3.0
clock rate = 745000
total global mem = 3757637632
total constant mem = 65536
overlap execution and transfer = ENABLED
can map host memory = ENABLED
texture alignment = 512
multiprocessor count = 8
shared mem per block = 49152
registers per block = 65536
threads in warp = 32
max threads per block = 1024
max threads dimensions = 1024 1024 64
max grid dimensions = 2147483647 65535 65535
############################################################

CudaDeviceManager::configure() => NTHREADS_PER_BLOCK = 32, NBLOCKS = 2147483647 END
-------------------------------------------------------------
Creating Simulation Maestro
-------------------------------------------------------------
Creating Simulation
-------------------------------------------------------------
Configuration of Simulator
-------------------------------------------------------------
Working Dir set to: '/home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/'
Results Dir set to: '/home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/'
-------------------------------------------------------------
-------------------------------------------------------------
Loading external modules
Initiating environment of loaded modules
Initiated PETSc
-------------------------------------------------------------
-------------------------------------------------------------
#
###### STARTING SUBSYSTEM : SubSystem ######
#
-------------------------------------------------------------
Building SubSystem
Name : SubSystem
Type : StandardSubSystem
-------------------------------------------------------------
#
###### CONFIG PHASE #################
#
-------------------------------------------------------------
Setting Namespace : Default
MeshData          : Default
PhysicalModelName : MHD3DProjection
PhysicalModelType : MHD3DProjection
SubSysStatus      : SubSystemStatus
-------------------------------------------------------------
WARNING: reference values not set !!!
-------------------------------------------------------------
Creating MeshCreator : CFmeshFileReader
Configuring Method [CFmeshFileReader] in the Namespace: [Default]
 Renumber : 0
-------------------------------------------------------------
-------------------------------------------------------------
Creating MeshAdapterMethod : Null1
Configuring Method [Null1] in the Namespace: [Default]
-------------------------------------------------------------
-------------------------------------------------------------
Creating CouplerMethod : Null2
Configuring Method [Null2] in the Namespace: [Default]
-------------------------------------------------------------
-------------------------------------------------------------
Creating SpaceMethod : CellCenterFVM
Configuring Method [CellCenterFVM] in the Namespace: [Default]
CellCenterFVM: configureFluxSplitter()
CellCenterFVM: configureDiffusiveFluxComputer()
CellCenterFVM: configureVarSetTransformers()
CellCenterFVM: configureJacobianLinearizer()
FVMCC_FluxSplitter::setup() => linearizerName = MHD3DProjectionLinearCons
CellCenterFVM: configurePolyReconstructor()
CellCenterFVM: configureLimiter()
CellCenterFVM: configureNodalStatesExtrapolator()
CellCenterFVM: configureEquationFilters()
CellCenterFVM: configureSourceTermComputer()
CellCenterFVM: Using FluxSplitter: LaxFried
CellCenterFVM: Using Update VarSet: Cons
CellCenterFVM: Using Solution VarSet: Cons
CellCenterFVM: Using Diffusive VarSet: Null
CellCenterFVM: Using Linear VarSet: Cons
CellCenterFVM: Using NodalStatesExtrapolator: DistanceBased
SETUP type = LeastSquareP1Setup
SETUP name = Setup1
SETUP type = LeastSquareP1UnSetup
SETUP name = UnSetup1
CellCenterFVM: Using ComputeRHS: NumJacobLaxFriedMHD3DCons
INIT type = InitState
INIT name = InField
BC type = MirrorMHD3DProjectionFVMCC
BC name = Wall
BC type = SuperInletFVMCC
BC name = Inlet
BC type = SuperOutletMHD3DProjectionFVMCC
BC name = Outlet
BC type = MirrorMHD3DProjectionFVMCC
BC name = BCTop
BC type = MirrorMHD3DProjectionFVMCC
BC name = BCBottom
-------------------------------------------------------------
-------------------------------------------------------------
Creating ErrorEstimatorMethod : Null4
Configuring Method [Null4] in the Namespace: [Default]
-------------------------------------------------------------
-------------------------------------------------------------
Creating LinearSystemSolver : BwdEulerLSS
Configuring Method [BwdEulerLSS] in the Namespace: [Default]
Petsc PCType = PCBJACOBI
Petsc KSPType = KSPGMRES
Petsc Nb KSP spaces = 30
Petsc MatOrderingType = MATORDERING_RCM
Petsc MaxIter = 50
Petsc Relative Tolerance = 1e-05
Petsc Absolute Tolerance = 1e-30
-------------------------------------------------------------
-------------------------------------------------------------
Creating ConvergenceMethod : BwdEuler
Configuring Method [BwdEuler] in the Namespace: [Default]
-------------------------------------------------------------
-------------------------------------------------------------
Creating DataProcessing : Null7
Configuring Method [Null7] in the Namespace: [Default]
-------------------------------------------------------------
-------------------------------------------------------------
Creating DataProcessing : Null8
Configuring Method [Null8] in the Namespace: [Default]
-------------------------------------------------------------
-------------------------------------------------------------
Creating OutputFormatter : Tecplot
Configuring Method [Tecplot] in the Namespace: [Default]
-------------------------------------------------------------
-------------------------------------------------------------
Creating OutputFormatter : CFmesh
Configuring Method [CFmesh] in the Namespace: [Default]
-------------------------------------------------------------
WARNING : Unused User Configuration Arguments:
Simulator.SubSystem.CFmeshFileReader.Extruder2DFVM.ExtrudeSize 0.001
Simulator.SubSystem.CFmeshFileReader.Extruder2DFVM.NbLayers 3
Simulator.SubSystem.CFmeshFileReader.Extruder2DFVM.Split false
Simulator.SubSystem.CellCenterFVM.NumJacobLaxFriedMHD3DCons.NbKernelBlocks 64

#
###### SOCKETS PLUG PHASE ###########
#
#
###### BUILD PHASE ##################
#
-------------------------------------------------------------
Setting up all PhysicalModel's

-------------------------------------------------------------
-------------------------------------------------------------
Setting up Physical Model [MHD3DProjection]
-------------------------------------------------------------
-------------------------------------------------------------
Setting up all MeshCreator's
-------------------------------------------------------------
Setup Method [CFmeshFileReader]
-------------------------------------------------------------
Building MeshData's
-------------------------------------------------------------
-------------------------------------------------------------
MeshCreator [CFmeshFileReader] Generate or Read Mesh

Memory usage before building mesh: 203.676 MB

Memory Usage before assembling connectivity: 203.676 MB
Extrusion from 2D CFmesh to 3D CFmesh took: 0.207635s
Original NbEquations = 9
Final NbEquations    = 9
Memory Usage before assembling connectivity: 203.676 MB
Calling mesh partitioner
+++
ParMetis: ncommonnodes = 2
ParMetis::doPartition() took 0.029979
+++

0%   10   20   30   40   50   60   70   80   90   100%
|----|----|----|----|----|----|----|----|----|----|
***************************************************
Reading data from nozzle.CFmesh took 0.121525s
Memory Usage after mesh reading: 203.676 MB
Building TRS: InnerCells
FVMCC_MeshDataBuilder::renumberCells()
nbCells = 7599, nbStates = 7599
FVMCC BoundaryFaces [5429]
FVMCC Max Total NbFaces [37995]
FVMCC Total nb faces [21748]
FVMCC Inner nb faces [16247]
FVMCC Boundary and Partition faces [5501]
Building TRS: InnerFaces
FVM Faces created : 21748
Building TRS: SlipWall
Built TRS named SlipWall
Building TRS: SuperInlet
Built TRS named SuperInlet
Building TRS: SuperOutlet
Built TRS named SuperOutlet
Building TRS: Bottom
Built TRS named Bottom
Building TRS: Top
Built TRS named Top
Number of partition faces detected = 72
Building TRS: PartitionFaces
Built PartitionFaces TRS 
Building MeshData from /home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/nozzle.CFmesh took 0.162814s
Building the mesh took: 0.16334s

Memory usage after building mesh: 203.676 MB
-------------------------------------------------------------
-------------------------------------------------------------
MeshCreator [CFmeshFileReader] Building Mesh Data
-------------------------------------------------------------
Building TRS info for MeshData in Namespace Default
MPICommPattern<DATA>::BuildGhostMap() START
MPICommPattern<DATA>::BuildGhostMap() => Sync_BuildReceiveTypes
MPICommPattern<DATA>::BuildGhostMap() END
MPICommPattern<DATA>::BuildGhostMap() START
MPICommPattern<DATA>::BuildGhostMap() => Sync_BuildReceiveTypes
MPICommPattern<DATA>::BuildGhostMap() END
#
###### SETUP PHASE ##################
#
-------------------------------------------------------------
Setting up DataPreProcessing's
Setup Method [Null7]
-------------------------------------------------------------
Setting up MeshAdapterMethod's
Setup Method [Null1]
-------------------------------------------------------------
Setting up ConvergenceMethod's
Setup Method [BwdEuler]
-------------------------------------------------------------
Setting up LinearSystemSolver's
Setup Method [BwdEulerLSS]
MPICommPattern<DATA>::reserve() -> m_data->GetTotalSize() = 7599
-------------------------------------------------------------
Setting up SpaceMethod's
Setup Method [CellCenterFVM]
FVMCC_ComputeRHSCell::setup() => Kernel grid sizes: <<< [7599 - 1], 32>>>
7599 >= 7599
7893 >= 7893
TRS Wall has zero gradient flags = 0 0 0 0 0 0 0 0 0 
TRS Inlet has zero gradient flags = 0 0 0 0 0 0 0 0 0 
TRS Outlet has zero gradient flags = 0 0 0 0 0 0 0 0 0 
TRS BCTop has zero gradient flags = 0 0 0 0 0 0 0 0 0 
TRS BCBottom has zero gradient flags = 0 0 0 0 0 0 0 0 0 
-------------------------------------------------------------
Setting up ErrorEstimatorMethod's
Setup Method [Null4]
-------------------------------------------------------------
Setting up CouplerMethod's
Setup Method [Null2]
-------------------------------------------------------------
CouplerMethod's write interface coordinates
-------------------------------------------------------------
Writing coordinates of CouplerMethod [Null2]
-------------------------------------------------------------
-------------------------------------------------------------
CouplerMethod's read interface coordinates
-------------------------------------------------------------
Reading coordinates of CouplerMethod [Null2]
-------------------------------------------------------------
-------------------------------------------------------------
CouplerMethod's match mesh and write
-------------------------------------------------------------
Mesh matching of CouplerMethod [Null2]
-------------------------------------------------------------
-------------------------------------------------------------
CouplerMethod's read match mesh
-------------------------------------------------------------
Reading matched mesh of CouplerMethod [Null2]
-------------------------------------------------------------
-------------------------------------------------------------
Setting up DataPostProcessing's
Setup Method [Null8]
-------------------------------------------------------------
Setting up OutputFormatter's
Setup Method [Tecplot]
Setup Method [CFmesh]

-------------------------------------------------------------
Initializing solution
-------------------------------------------------------------
Initializing solution of method [CellCenterFVM]
-------------------------------------------------------------
Writing initial solution ... 
Writing solution to: /home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/nozzle3DFVMMHD1stProjImpl-P0.plt
Writing took 0.077142s
ParCFmeshFileWriter::writeToFile() => IO rank is 0
Element written 
Nodes written 
States written 
Writing solution to: /home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/nozzle3DFVMMHD1stProjImpl-P0.CFmesh
Writing took 0.203379s
-------------------------------------------------------------
#
###### RUN PHASE ####################
#
BwdEuler : assembled linear system in 0.354767s
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Null argument, when expecting valid pointer!
[0]PETSC ERROR: Trying to zero at a null pointer!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development GIT revision: unknown  GIT Date: unknown
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./coolfluid-solver on a arch-x86_64 named arkepler.private.vki.eu by lani Sun Jan 19 23:53:19 2014
[0]PETSC ERROR: Libraries linked from /home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed/lib
[0]PETSC ERROR: Configure run at Sun Jan 19 22:11:29 2014
[0]PETSC ERROR: Configure options --prefix=/home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed --with-debugging=1 COPTFLAGS="-O3 " FOPTFLAGS="-O3 " --with-mpi-dir=/home/lani/local/cf2_2013.9/openmpi --download-f2cblaslapack=1 --with-fortran=1 --with-shared-libraries=1 --with-cudac=/opt/cuda/5.0.35/bin/nvcc --with-cuda-dir=/opt/cuda/5.0.35 --with-cuda=1 --with-cusp=1 --with-thrust=1 --with-cusp-dir=/home/lani/local/cf2_2013.9 --PETSC_ARCH=arch-x86_64
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: PetscMemzero() line 1932 in /home/lani/petsc-dev/include/petscsys.h
[0]PETSC ERROR: VecSet_Seq() line 729 in /home/lani/petsc-dev/src/vec/vec/impls/seq/dvec2.c
[0]PETSC ERROR: VecSet() line 575 in /home/lani/petsc-dev/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: KSPSolve() line 417 in /home/lani/petsc-dev/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: PCApply_BJacobi_Singleblock() line 675 in /home/lani/petsc-dev/src/ksp/pc/impls/bjacobi/bjacobi.c
[0]PETSC ERROR: PCApply() line 440 in /home/lani/petsc-dev/src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSP_PCApply() line 227 in /home/lani/petsc-dev/include/petsc-private/kspimpl.h
[0]PETSC ERROR: KSPInitialResidual() line 64 in /home/lani/petsc-dev/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: KSPSolve_GMRES() line 234 in /home/lani/petsc-dev/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: KSPSolve() line 432 in /home/lani/petsc-dev/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: execute() line 122 in /home/lani/CF_2013.9/plugins/Petsc/StdParSolveSys.cxx
KSP convergence reached at iteration: 0
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Null argument, when expecting valid pointer!
[1]PETSC ERROR: Trying to zero at a null pointer!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Development GIT revision: unknown  GIT Date: unknown
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: ./coolfluid-solver on a arch-x86_64 named arkepler.private.vki.eu by lani Sun Jan 19 23:53:19 2014
[1]PETSC ERROR: Libraries linked from /home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed/lib
[1]PETSC ERROR: Configure run at Sun Jan 19 22:11:29 2014
[1]PETSC ERROR: Configure options --prefix=/home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed --with-debugging=1 COPTFLAGS="-O3 " FOPTFLAGS="-O3 " --with-mpi-dir=/home/lani/local/cf2_2013.9/openmpi --download-f2cblaslapack=1 --with-fortran=1 --with-shared-libraries=1 --with-cudac=/opt/cuda/5.0.35/bin/nvcc --with-cuda-dir=/opt/cuda/5.0.35 --with-cuda=1 --with-cusp=1 --with-thrust=1 --with-cusp-dir=/home/lani/local/cf2_2013.9 --PETSC_ARCH=arch-x86_64
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: PetscMemzero() line 1932 in /home/lani/petsc-dev/include/petscsys.h
[1]PETSC ERROR: VecSet_Seq() line 729 in /home/lani/petsc-dev/src/vec/vec/impls/seq/dvec2.c
[1]PETSC ERROR: VecSet() line 575 in /home/lani/petsc-dev/src/vec/vec/interface/rvector.c
[1]PETSC ERROR: KSPSolve() line 417 in /home/lani/petsc-dev/src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: PCApply_BJacobi_Singleblock() line 675 in /home/lani/petsc-dev/src/ksp/pc/impls/bjacobi/bjacobi.c
[1]PETSC ERROR: PCApply() line 440 in /home/lani/petsc-dev/src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSP_PCApply() line 227 in /home/lani/petsc-dev/include/petsc-private/kspimpl.h
[1]PETSC ERROR: KSPInitialResidual() line 64 in /home/lani/petsc-dev/src/ksp/ksp/interface/itres.c
[1]PETSC ERROR: KSPSolve_GMRES() line 234 in /home/lani/petsc-dev/src/ksp/ksp/impls/gmres/gmres.c
[1]PETSC ERROR: KSPSolve() line 432 in /home/lani/petsc-dev/src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: execute() line 122 in /home/lani/CF_2013.9/plugins/Petsc/StdParSolveSys.cxx
BwdEuler : solved linear system in 0.168021s
Iter:     1      Res: [-1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 ]  CFL:     1000 CPUTime: 0.471928    Mem: 287.945 MB
BwdEuler : assembled linear system in 0.341927s
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Null argument, when expecting valid pointer!
[0]PETSC ERROR: Trying to zero at a null pointer!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Development GIT revision: unknown  GIT Date: unknown
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: ./coolfluid-solver on a arch-x86_64 named arkepler.private.vki.eu by lani Sun Jan 19 23:53:19 2014
[0]PETSC ERROR: Libraries linked from /home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed/lib
[0]PETSC ERROR: Configure run at Sun Jan 19 22:11:29 2014
[0]PETSC ERROR: Configure options --prefix=/home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed --with-debugging=1 COPTFLAGS="-O3 " FOPTFLAGS="-O3 " --with-mpi-dir=/home/lani/local/cf2_2013.9/openmpi --download-f2cblaslapack=1 --with-fortran=1 --with-shared-libraries=1 --with-cudac=/opt/cuda/5.0.35/bin/nvcc --with-cuda-dir=/opt/cuda/5.0.35 --with-cuda=1 --with-cusp=1 --with-thrust=1 --with-cusp-dir=/home/lani/local/cf2_2013.9 --PETSC_ARCH=arch-x86_64
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: PetscMemzero() line 1932 in /home/lani/petsc-dev/include/petscsys.h
[0]PETSC ERROR: VecSet_Seq() line 729 in /home/lani/petsc-dev/src/vec/vec/impls/seq/dvec2.c
[0]PETSC ERROR: VecSet() line 575 in /home/lani/petsc-dev/src/vec/vec/interface/rvector.c
[0]PETSC ERROR: KSPSolve() line 417 in /home/lani/petsc-dev/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: PCApply_BJacobi_Singleblock() line 675 in /home/lani/petsc-dev/src/ksp/pc/impls/bjacobi/bjacobi.c
[0]PETSC ERROR: PCApply() line 440 in /home/lani/petsc-dev/src/ksp/pc/interface/precon.c
[1]PETSC ERROR: --------------------- Error Message ------------------------------------
[1]PETSC ERROR: Null argument, when expecting valid pointer!
[1]PETSC ERROR: Trying to zero at a null pointer!
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: Petsc Development GIT revision: unknown  GIT Date: unknown
[1]PETSC ERROR: See docs/changes/index.html for recent updates.
[1]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[1]PETSC ERROR: See docs/index.html for manual pages.
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: ./coolfluid-solver on a arch-x86_64 named arkepler.private.vki.eu by lani Sun Jan 19 23:53:19 2014
[1]PETSC ERROR: Libraries linked from /home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed/lib
[1]PETSC ERROR: Configure run at Sun Jan 19 22:11:29 2014
[1]PETSC ERROR: Configure options --prefix=/home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed --with-debugging=1 COPTFLAGS="-O3 " FOPTFLAGS="-O3 " --with-mpi-dir=/home/lani/local/cf2_2013.9/openmpi --download-f2cblaslapack=1 --with-fortran=1 --with-shared-libraries=1 --with-cudac=/opt/cuda/5.0.35/bin/nvcc --with-cuda-dir=/opt/cuda/5.0.35 --with-cuda=1 --with-cusp=1 --with-thrust=1 --with-cusp-dir=/home/lani/local/cf2_2013.9 --PETSC_ARCH=arch-x86_64
[1]PETSC ERROR: ------------------------------------------------------------------------
[1]PETSC ERROR: PetscMemzero() line 1932 in /home/lani/petsc-dev/include/petscsys.h
[1]PETSC ERROR: VecSet_Seq() line 729 in /home/lani/petsc-dev/src/vec/vec/impls/seq/dvec2.c
[1]PETSC ERROR: VecSet() line 575 in /home/lani/petsc-dev/src/vec/vec/interface/rvector.c
[1]PETSC ERROR: KSPSolve() line 417 in /home/lani/petsc-dev/src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: PCApply_BJacobi_Singleblock() line 675 in /home/lani/petsc-dev/src/ksp/pc/impls/bjacobi/bjacobi.c
[1]PETSC ERROR: PCApply() line 440 in /home/lani/petsc-dev/src/ksp/pc/interface/precon.c
[1]PETSC ERROR: KSP_PCApply() line 227 in /home/lani/petsc-dev/include/petsc-private/kspimpl.h
[1]PETSC ERROR: KSPInitialResidual() line 64 in /home/lani/petsc-dev/src/ksp/ksp/interface/itres.c
[1]PETSC ERROR: KSPSolve_GMRES() line 234 in /home/lani/petsc-dev/src/ksp/ksp/impls/gmres/gmres.c
[1]PETSC ERROR: KSPSolve() line 432 in /home/lani/petsc-dev/src/ksp/ksp/interface/itfunc.c
[1]PETSC ERROR: execute() line 122 in /home/lani/CF_2013.9/plugins/Petsc/StdParSolveSys.cxx
[0]PETSC ERROR: KSP_PCApply() line 227 in /home/lani/petsc-dev/include/petsc-private/kspimpl.h
[0]PETSC ERROR: KSPInitialResidual() line 64 in /home/lani/petsc-dev/src/ksp/ksp/interface/itres.c
[0]PETSC ERROR: KSPSolve_GMRES() line 234 in /home/lani/petsc-dev/src/ksp/ksp/impls/gmres/gmres.c
[0]PETSC ERROR: KSPSolve() line 432 in /home/lani/petsc-dev/src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: execute() line 122 in /home/lani/CF_2013.9/plugins/Petsc/StdParSolveSys.cxx
KSP convergence reached at iteration: 0
BwdEuler : solved linear system in 0.150527s
Iter:     2      Res: [-1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 -1.7976931e+308 ]  CFL:     1000 CPUTime: 0.935858    Mem: 288.484 MB
SubSystem WallTime: 1.02061s
#
###### UNSETUP PHASE ################
#
Total Number Iter: 2 Reached Residual: -1.79769e+308 and took: 0.935858 sec
Writing solution to: /home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/nozzle3DFVMMHD1stProjImpl-P0.plt
Writing took 0.077285s
ParCFmeshFileWriter::writeToFile() => IO rank is 0
Nodes written 
States written 
Writing solution to: /home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/nozzle3DFVMMHD1stProjImpl-P0.CFmesh
Writing took 0.122266s
-------------------------------------------------------------
#
###### SOCKETS UNPLUG PHASE #########
#
#
###### DESTRUCTION SUBSYSTEM PHASE #########
#
************************************************************************************************************************
***             WIDEN YOUR WINDOW TO 120 CHARACTERS.  Use 'enscript -r -fCourier9' to print this document            ***
************************************************************************************************************************

---------------------------------------------- PETSc Performance Summary: ----------------------------------------------

./coolfluid-solver on a arch-x86_64 named arkepler.private.vki.eu with 2 processors, by lani Sun Jan 19 23:54:15 2014
Using Petsc Development GIT revision: unknown  GIT Date: unknown

                         Max       Max/Min        Avg      Total 
Time (sec):           5.587e+01      1.00000   5.587e+01
Objects:              3.100e+01      1.00000   3.100e+01
Flops:                1.114e+08      1.05061   1.087e+08  2.174e+08
Flops/sec:            1.993e+06      1.05061   1.945e+06  3.891e+06
Memory:               8.543e+07      1.04033              1.675e+08
MPI Messages:         4.000e+00      1.00000   4.000e+00  8.000e+00
MPI Message Lengths:  4.908e+03      1.00000   1.227e+03  9.816e+03
MPI Reductions:       9.500e+01      1.00000

Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract)
                            e.g., VecAXPY() for real vectors of length N --> 2N flops
                            and VecAXPY() for complex vectors of length N --> 8N flops

Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total   counts   %Total     Avg         %Total   counts   %Total 
 0:      Main Stage: 5.5867e+01 100.0%  2.1737e+08 100.0%  8.000e+00 100.0%  1.227e+03      100.0%  9.400e+01  98.9% 

------------------------------------------------------------------------------------------------------------------------
See the 'Profiling' chapter of the users' manual for details on interpreting output.
Phase summary info:
   Count: number of times phase was executed
   Time and Flops: Max - maximum over all processors
                   Ratio - ratio of maximum to minimum over all processors
   Mess: number of messages sent
   Avg. len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %f - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------


      ##########################################################
      #                                                        #
      #                          WARNING!!!                    #
      #                                                        #
      #   This code was compiled with a debugging option,      #
      #   To get timing results run ./configure                #
      #   using --with-debugging=no, the performance will      #
      #   be generally two or three times faster.              #
      #                                                        #
      ##########################################################


Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %f %M %L %R  %T %f %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

ThreadCommRunKer       1 1.0 3.8147e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
ThreadCommBarrie       1 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecCopy                2 1.0 2.1617e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAssemblyBegin       2 1.0 3.6960e-03 49.2 0.00e+00 0.0 0.0e+00 0.0e+00 6.0e+00  0  0  0  0  6   0  0  0  0  6     0
VecAssemblyEnd         2 1.0 1.5974e-05 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecCUSPCopyFrom        3 1.0 2.8014e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatLUFactorNum         2 1.0 1.8086e-01 1.1 1.11e+08 1.1 0.0e+00 0.0e+00 0.0e+00  0100  0  0  0   0100  0  0  0  1202
MatILUFactorSym        2 1.0 4.2481e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00  0  0  0  0  2   0  0  0  0  2     0
MatAssemblyBegin       2 1.0 1.1921e-05 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatAssemblyEnd         2 1.0 5.1590e-02 1.3 0.00e+00 0.0 4.0e+00 1.3e+03 2.1e+01  0  0 50 53 22   0  0 50 53 22     0
MatGetRowIJ            2 1.0 6.1989e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetOrdering         2 1.0 6.4130e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  4   0  0  0  0  4     0
MatZeroEntries         2 1.0 1.5367e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatCUSPCopyTo          4 1.0 2.7948e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPSetUp               4 1.0 2.7692e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  4   0  0  0  0  4     0
PCSetUp                4 1.0 2.3533e-01 1.0 1.11e+08 1.1 0.0e+00 0.0e+00 2.2e+01  0100  0  0 23   0100  0  0 23   924
PCSetUpOnBlocks        2 1.0 2.3529e-01 1.0 1.11e+08 1.1 0.0e+00 0.0e+00 1.4e+01  0100  0  0 15   0100  0  0 15   924
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Vector    12             11      1596120     0
      Vector Scatter     1              0            0     0
              Matrix     5              2     77089112     0
       Krylov Solver     2              2        19536     0
      Preconditioner     2              2         1880     0
           Index Set     8              8       535184     0
              Viewer     1              0            0     0
========================================================================================================================
Average time to get PetscTime(): 0
Average time for MPI_Barrier(): 8.10623e-07
Average time for zero size MPI_Send(): 4.05312e-06
#PETSc Option Table entries:
--scase /home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/cudaImpl.CFcase
-log_summary
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure run at: Sun Jan 19 22:11:29 2014
Configure options: --prefix=/home/lani/local/cf2_2013.9/openmpi/petsc_cuda_fixed --with-debugging=1 COPTFLAGS="-O3 " FOPTFLAGS="-O3 " --with-mpi-dir=/home/lani/local/cf2_2013.9/openmpi --download-f2cblaslapack=1 --with-fortran=1 --with-shared-libraries=1 --with-cudac=/opt/cuda/5.0.35/bin/nvcc --with-cuda-dir=/opt/cuda/5.0.35 --with-cuda=1 --with-cusp=1 --with-thrust=1 --with-cusp-dir=/home/lani/local/cf2_2013.9 --PETSC_ARCH=arch-x86_64
-----------------------------------------
Libraries compiled on Sun Jan 19 22:11:29 2014 on arkepler.private.vki.eu 
Machine characteristics: Linux-2.6.32-358.6.1.el6.x86_64-x86_64-with-redhat-6.3-Carbon
Using PETSc directory: /home/lani/petsc-dev
Using PETSc arch: arch-x86_64
-----------------------------------------

Using C compiler: /home/lani/local/cf2_2013.9/openmpi/bin/mpicc  -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -O3  ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: /home/lani/local/cf2_2013.9/openmpi/bin/mpif77  -fPIC -Wall -Wno-unused-variable -O3   ${FOPTFLAGS} ${FFLAGS} 
-----------------------------------------

Using include paths: -I/home/lani/petsc-dev/arch-x86_64/include -I/home/lani/petsc-dev/include -I/home/lani/petsc-dev/include -I/home/lani/petsc-dev/arch-x86_64/include -I/opt/cuda/5.0.35/include -I/home/lani/local/cf2_2013.9/ -I/home/lani/local/cf2_2013.9/include -I/home/lani/local/cf2_2013.9/openmpi/include
-----------------------------------------

Using C linker: /home/lani/local/cf2_2013.9/openmpi/bin/mpicc
Using Fortran linker: /home/lani/local/cf2_2013.9/openmpi/bin/mpif77
Using libraries: -Wl,-rpath,/home/lani/petsc-dev/arch-x86_64/lib -L/home/lani/petsc-dev/arch-x86_64/lib -lpetsc -Wl,-rpath,/home/lani/petsc-dev/arch-x86_64/lib -L/home/lani/petsc-dev/arch-x86_64/lib -lf2clapack -lf2cblas -lX11 -lpthread -Wl,-rpath,/opt/cuda/5.0.35/lib64 -L/opt/cuda/5.0.35/lib64 -lcufft -lcublas -lcudart -lcusparse -Wl,-rpath,/home/lani/local/cf2_2013.9/openmpi/lib -L/home/lani/local/cf2_2013.9/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.4.6 -L/usr/lib/gcc/x86_64-redhat-linux/4.4.6 -lmpi_f77 -lgfortran -lm -lm -lm -lm -lm -lm -lm -lm -lmpi_cxx -lstdc++ -lmpi_cxx -lstdc++ -ldl -lmpi -lrt -lnsl -lutil -lgcc_s -lpthread -ldl 
-----------------------------------------

WARNING! There are options you set that were not used!
WARNING! could be spelling mistake, etc!
Option left: name:--scase value: /home/lani/CF_2013.9/plugins/MHD/testcases/Nozzle3D/cudaImpl.CFcase
Terminated PETSc
Exit value 0
-------------------------------------------------------------
COOLFluiD Environment Terminated
-------------------------------------------------------------
Exit value 0
