> On 31 May 2017, at 21:46, Kannan, Ramakrishnan <kann...@ornl.gov> wrote:
> 
> Hello,
>  
> I have a sparse 1D row-distributed matrix in which every MPI process owns 
> an m/p x n block of the global m x n matrix. I am running NHEP with 
> krylovschur on it, and it is throwing an error. For your reference, I have 
> attached the modified ex5.c, in which I call SetSizes on the matrix to 
> emulate the 1D row distribution, and the log file with the error.
>  
> In the unmodified ex5.c, for m=5, N=15, local_m and local_n are both 3. 
> How is the global 15x15 matrix distributed locally as 3x3 matrices? When I 
> print the global matrix, it doesn’t appear to be diagonal either. 
>  
> If SLEPc doesn’t support sparse 1D row-distributed matrices, how should I 
> redistribute mine so that I can run NHEP on it?
> -- 
> Regards,
> Ramki
>  
> <ex5.c><slepc.o607511>

As explained in the manpage, the local column size n must match the local size 
of the x vector, so it must also be N/mpisize:
http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetSizes.html
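
For a square N x N matrix that means something like the following (a minimal
sketch, assuming N is divisible by mpisize; error checking abbreviated):

    Mat            A;
    PetscInt       N = 15, nloc;
    PetscMPIInt    size;
    PetscErrorCode ierr;

    ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
    nloc = N/size;   /* local rows; the local columns must be the same */
    ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
    /* local column size must equal the local size of x, i.e. N/mpisize */
    ierr = MatSetSizes(A,nloc,nloc,N,N);CHKERRQ(ierr);
    ierr = MatSetFromOptions(A);CHKERRQ(ierr);
    ierr = MatSetUp(A);CHKERRQ(ierr);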

But be warned that your code will not work when N is not divisible by mpisize. 
In that case, the local sizes will not add up to the global dimension.
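
If you do want to set local sizes explicitly, you can let PETSc compute a
consistent split for you; a sketch (PetscSplitOwnership produces the same
block-row layout PETSc would pick by default):

    PetscInt nloc = PETSC_DECIDE;
    ierr = PetscSplitOwnership(PETSC_COMM_WORLD,&nloc,&N);CHKERRQ(ierr);
    /* nloc is now valid even when N is not divisible by mpisize */
    ierr = MatSetSizes(A,nloc,nloc,N,N);CHKERRQ(ierr);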

Setting local sizes is not necessary in your case, since by default PETSc is 
already doing a 1D block-row distribution.
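
That is, it is enough to pass PETSC_DECIDE for both local sizes (sketch):

    ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,N,N);CHKERRQ(ierr);

and PETSc will choose the same 1D block-row distribution for you.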

Jose
