I have implemented an efficient MatView() for parallel MPIBAIJ matrices with the 
binary viewer in petsc-dev 
(http://www.mcs.anl.gov/petsc/petsc-as/developers/index.html). With this you can 
save parallel MPIBAIJ matrices without first converting them to AIJ format.
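
Until you pick up petsc-dev, a minimal sketch of the workaround in released 
PETSc 3.1 is to convert the matrix to AIJ first and view the copy. The matrix 
and viewer names below are taken from the code in the quoted message; the 
temporary drdwt_aij is a hypothetical name introduced for the sketch:

   Mat drdwt_aij

   ! make an AIJ copy of the BAIJ matrix, write it, then free the copy
   call MatConvert(drdwt,MATAIJ,MAT_INITIAL_MATRIX,drdwt_aij,ierr)
   call PetscViewerBinaryOpen(sumb_petsc_comm_world,drdw_name, &
        FILE_MODE_WRITE,bin_viewer,ierr)
   call MatView(drdwt_aij,bin_viewer,ierr)
   call PetscViewerDestroy(bin_viewer,ierr)
   call MatDestroy(drdwt_aij,ierr)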


   Barry

On Sep 21, 2010, at 9:30 AM, Gaetan Kenway wrote:

> Hello
> 
> I am a PETSc user and have run into a problem with MatView. I am trying to 
> write a matrix to a file so I can load it later instead of recomputing it, for 
> faster debugging. The matrix I'm trying to output is drdwt; it is a parallel 
> block AIJ (MPIBAIJ) matrix with a block size of 5. The matrix is assembled, 
> and the following code works when I run it in serial:
> 
>  call PetscViewerBinaryOpen(sumb_petsc_comm_world,drdw_name, &
>       FILE_MODE_WRITE,bin_viewer,ierr)
>  call MatView(drdwt,bin_viewer,ierr)
>  call PetscViewerDestroy(bin_viewer,ierr)
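> 
> For reading the file back in the debugging runs, a minimal sketch under the 
> PETSc 3.1 MatLoad calling sequence would be the following (this assumes the 
> block size of 5 is supplied on the command line with -matload_block_size 5):
> 
>  call PetscViewerBinaryOpen(sumb_petsc_comm_world,drdw_name, &
>       FILE_MODE_READ,bin_viewer,ierr)
>  call MatLoad(bin_viewer,MATMPIBAIJ,drdwt,ierr)
>  call PetscViewerDestroy(bin_viewer,ierr)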
> 
> The matrix size is approximately 300k by 300k, and I get an output file of 
> approximately 245 MB, which is what I expect. However, when I run the same 
> code in parallel on 3 processors, it hangs at the MatView call until I am 
> forced to kill the processes; I've let it run for 20 minutes with no sign of 
> stopping.
> 
> I am not sure what is causing this. I'm using openmpi-1.4.1 and PETSc 3.1 on 
> 32-bit Ubuntu 10.10.
> 
> Thank you,
> 
> Gaetan Kenway
> 
> 
