> On May 31, 2017, at 4:14 PM, Kannan, Ramakrishnan <[email protected]> wrote:
> 
> Jose,
> 
> Thank you for the quick reply. 
> 
> In this specific example, there are 5 MPI processes and each process owns a 
> 1D row-distributed matrix of size 3x15. According to MatSetSizes, I 
> should set local rows, local cols, global rows, global cols, which in this 
> case are 3,15,15,15 respectively. Why would I instead set 3,3,15,15?

   You have not carefully read the definition of "local size" for matrices in 
PETSc:

http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetSizes.html
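
   For example, a minimal sketch of what the manpage means (assuming 5 
processes and a 15x15 matrix, so each rank owns 3 rows; variable names are 
illustrative):

      Mat A;
      MatCreate(PETSC_COMM_WORLD, &A);
      /* m = 3: number of rows stored on this rank
         n = 3: local size of the vector x in y = A*x,
                NOT the number of columns stored on this rank */
      MatSetSizes(A, 3, 3, 15, 15);
      MatSetFromOptions(A);
      MatSetUp(A);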

> 
> Also, in our program we use global_row_idx and global_col_idx for MatSetValues. 
> If I set 3,3,15,15 instead of 3,15,15,15, MatSetValues fails with the 
> error “nnz cannot be greater than row length:”.

   This is a different problem that may need to be tracked down.
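
   That said, MatSetValues() always takes global row and column indices, 
independent of the local sizes, so passing global_row_idx/global_col_idx is 
correct either way. A sketch (the values are illustrative):

      /* indices are global no matter how the local layout was chosen */
      PetscInt    grow = 7, gcol = 12;
      PetscScalar val  = 1.0;
      MatSetValues(A, 1, &grow, 1, &gcol, &val, INSERT_VALUES);

   One possible place to look, assuming MPIAIJ preallocation is involved: the 
"diagonal" block is the local m x n submatrix, so with local sizes 3,3 a 
d_nnz entry larger than 3 would produce exactly this error message.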

> Also, to test that 3,15,15,15 in MatSetSizes is right, we called PETSc's 
> MatCreateVecs and MatMult, which seemed to work fine too. 

   This will not work under normal circumstances, so something else must be 
different as well.
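
   Note that vectors obtained with MatCreateVecs() always inherit a layout 
compatible with the matrix, so a MatMult() that "works" does not by itself 
validate the chosen local sizes. A sketch:

      Vec x, y;
      MatCreateVecs(A, &x, &y);  /* x matches the column layout, y the rows */
      MatMult(A, x, y);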

   Barry

> 
> Appreciate your kind help.  
> -- 
> Regards,
> Ramki
> 
> 
> On 5/31/17, 4:26 PM, "Jose E. Roman" <[email protected]> wrote:
> 
> 
>> El 31 may 2017, a las 21:46, Kannan, Ramakrishnan <[email protected]> 
>> escribió:
>> 
>> Hello,
>> 
>> I have a sparse 1D row-distributed matrix in which every MPI process 
>> owns an m/p x n block of the global m x n matrix. I am running NHEP with 
>> Krylov-Schur on it, and it is throwing an error. For your reference, I have 
>> attached a modified ex5.c, in which I call MatSetSizes on the matrix to 
>> emulate the 1D row distribution, and the log file with the error.
>> 
>> In the unmodified ex5.c, for m=5, N=15, local_m and local_n are both 3. 
>> How is the global 15x15 matrix distributed locally as 3x3 matrices? When I 
>> print the global matrix, it doesn’t appear to be diagonal either. 
>> 
>> If SLEPc doesn’t support a sparse 1D row-distributed matrix, how do I need to 
>> redistribute it so that I can run NHEP on it?
>> -- 
>> Regards,
>> Ramki
>> 
>> <ex5.c><slepc.o607511>
> 
>    As explained in the manpage, the local column size n must match the local 
> size of the x vector, so it must also be N/mpisize:
>    
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Mat/MatSetSizes.html
> 
>    But be warned that your code will not work when N is not divisible by 
> mpisize. In that case, global and local dimensions won't match.
> 
>    Setting local sizes is not necessary in your case, since by default PETSc 
> is already doing a 1D block-row distribution.
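> 
>    A minimal sketch of relying on that default (assuming only the global 
> size N is known; PETSC_DECIDE also handles N not divisible by mpisize):
> 
>       Mat      A;
>       PetscInt N = 15;
>       MatCreate(PETSC_COMM_WORLD, &A);
>       MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
>       MatSetFromOptions(A);
>       MatSetUp(A);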
> 
>    Jose
> 
> 
> 
> 
