(1) Are you sure that mpiexec is the correct mpiexec for that build of PETSc? 
Run mpiexec -n 2 ./ex23 -info to check. Most likely it is not, and it is 
starting the program twice; each instance thinks it is the entire world, so 
the two runs execute sequentially and each prints its own output.
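
If it is the wrong mpiexec, a quick way to see it is a small program that 
asks PETSc how many processes it sees (a minimal sketch, not from the PETSc 
sources; the file name checkranks.c is just an example):

  #include <petscsys.h>

  int main(int argc, char **argv)
  {
    PetscMPIInt size;

    PetscInitialize(&argc, &argv, (char *)0, (char *)0);
    MPI_Comm_size(PETSC_COMM_WORLD, &size);
    /* PetscPrintf() prints only from rank 0 of the communicator, so with a
       matching mpiexec this line appears once regardless of -n; with a
       mismatched mpiexec each process is its own "world" and the line
       appears once per process, each reporting size 1. */
    PetscPrintf(PETSC_COMM_WORLD, "PETSc sees %d process(es)\n", size);
    PetscFinalize();
    return 0;
  }

If mpiexec -n 2 ./checkranks prints "PETSc sees 1 process(es)" twice, the 
mpiexec on your path does not match the MPI that PETSc was built with.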

(2) This is likely an outdated comment from when the script only handled 
square sparse matrices.
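
As a cross-check on the PETSc side, the file written by PetscBinaryWrite.m 
can be loaded back and its dimensions printed (a rough sketch assuming the 
petsc-3.1 calling sequences; "rect.mat" is just a placeholder for a file 
you wrote from MATLAB):

  #include <petscmat.h>

  int main(int argc, char **argv)
  {
    Mat         A;
    PetscViewer viewer;
    PetscInt    m, n;

    PetscInitialize(&argc, &argv, (char *)0, (char *)0);
    /* open the binary file produced by PetscBinaryWrite.m and load it */
    PetscViewerBinaryOpen(PETSC_COMM_WORLD, "rect.mat", FILE_MODE_READ, &viewer);
    MatLoad(viewer, MATAIJ, &A);
    /* a rectangular matrix should report its true row and column counts */
    MatGetSize(A, &m, &n);
    PetscPrintf(PETSC_COMM_WORLD, "loaded a %d x %d matrix\n", (int)m, (int)n);
    MatDestroy(A);
    PetscViewerDestroy(viewer);
    PetscFinalize();
    return 0;
  }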

  Barry



On Jan 17, 2011, at 4:46 PM, Gaurish Telang wrote:

> Hi.
> 
> I have two questions:
> 
> (1) 
> 
> I was curious to know why the following happens with the PETSc standard 
> output. Having created the executable 'test', when I try to run it with 
> mpiexec -n 2 ./test 
> the same output is printed to the terminal twice. If I use 3 processes, 
> the same output is printed three times.
> 
> In short, the number of processes equals the number of times the PETSc 
> output is printed. Could this be a mistake with my PETSc installation?
> 
> For example, consider the code in src/ksp/ksp/examples/tutorials/ex23.c. 
> After building the ex23 executable, running it with one and then two 
> processes gives the following terminal output:
> 
> gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 1 ./ex23
> KSP Object:
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt 
> Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object:
>   type: jacobi
>   linear system matrix = precond matrix:
>   Matrix Object:
>     type=seqaij, rows=10, cols=10
>     total: nonzeros=28, allocated nonzeros=50
>       not using I-node routines
> Norm of error < 1.e-12, Iterations 5
> gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$ mpiexec -n 2 ./ex23
> KSP Object:
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt 
> Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object:
>   type: jacobi
>   linear system matrix = precond matrix:
>   Matrix Object:
>     type=seqaij, rows=10, cols=10
>     total: nonzeros=28, allocated nonzeros=50
>       not using I-node routines
> Norm of error < 1.e-12, Iterations 5
> KSP Object:
>   type: gmres
>     GMRES: restart=30, using Classical (unmodified) Gram-Schmidt 
> Orthogonalization with no iterative refinement
>     GMRES: happy breakdown tolerance 1e-30
>   maximum iterations=10000, initial guess is zero
>   tolerances:  relative=1e-07, absolute=1e-50, divergence=10000
>   left preconditioning
>   using PRECONDITIONED norm type for convergence test
> PC Object:
>   type: jacobi
>   linear system matrix = precond matrix:
>   Matrix Object:
>     type=seqaij, rows=10, cols=10
>     total: nonzeros=28, allocated nonzeros=50
>       not using I-node routines
> Norm of error < 1.e-12, Iterations 5
> gaurish108@gaurish108-laptop:~/Desktop/ResearchMeetings/SUPERPETS/petsc-3.1-p5/src/ksp/ksp/examples/tutorials$
>  
> 
> 
> 
> (2) 
> 
> Also, I was told yesterday on the PETSc users mailing list that the MATLAB 
> M-file PetscBinaryWrite.m converts a sparse matrix in MATLAB into PETSc 
> binary format. 
> The comments near the top of the code say that it only works for square 
> sparse matrices, but it seems to work quite well for rectangular sparse 
> MATLAB matrices also. 
> I have tested this in conjunction with PetscBinaryRead.m, which reads a 
> PETSc binary file back into MATLAB as a sparse matrix.
> 
> Is there something I might have missed, or some error that I might be 
> making?
> 
> Comments in PetscBinaryWrite.m:
> ================================================
> %  Writes in PETSc binary file sparse matrices and vectors
> %  if the array is multidimensional and dense it is saved
> %  as a one dimensional array
> %
> %  Only works for square sparse matrices 
> ...
> 
> 
> 
>  
