Hi, Matt

Thank you. I see the point :-)

Cheers

Gao
________________________________
From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] 
on behalf of Matthew Knepley [[email protected]]
Sent: Thursday, March 22, 2012 9:38 PM
To: PETSc users list
Subject: Re: [petsc-users] storage of parallel dense matrices and 
(anti)symmetric matrices

On Thu, Mar 22, 2012 at 3:32 PM, Gao Bin <bin.gao at uit.no> wrote:
Hi, Jed

Thank you very much for your quick reply. May I ask two further questions?

(1) Why doesn't PETSc also partition the columns, so that each processor uses 
less memory?

2D distributions are not efficient for sparse matrices. They are sometimes used 
for dense.

(2) If the matrix I use is a square matrix, the number of local columns "n" 
should be equal to the number of local rows "m" when calling MatCreateMPIDense, 
am I right?

Yes. You can always let PETSc choose by giving PETSC_DETERMINE.
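
For concreteness, a minimal sketch (the global size N here is hypothetical, and 
in more recent PETSc releases the call is MatCreateDense):

  Mat      A;
  PetscInt N = 1000;   /* hypothetical global dimension of a square matrix */
  /* Let PETSc choose the local row/column sizes m and n */
  MatCreateMPIDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE,
                    N, N, PETSC_NULL, &A);   /* PETSC_NULL: PETSc allocates the storage */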

   Matt

Thank you again for your answer.

Cheers

Gao
________________________________
From: petsc-users-bounces at mcs.anl.gov [petsc-users-bounces at mcs.anl.gov] 
on behalf of Jed Brown [jedbrown at mcs.anl.gov]
Sent: Thursday, March 22, 2012 9:17 PM
To: PETSc users list
Subject: Re: [petsc-users] storage of parallel dense matrices and 
(anti)symmetric matrices

2012/3/22 Gao Bin <bin.gao at uit.no>
"The parallel dense matrices are partitioned by rows across the processors, so 
that each local rectangular submatrix is stored in the dense format described 
above."

Does that mean each processor will have several contiguous rows and all columns 
of the matrix? If so, why do we need to specify "n" -- the number of local 
columns -- when calling MatCreateMPIDense?

Interpret the local column size n as the local size of the Vec that the Mat 
will be applied to.
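
A minimal sketch of that correspondence (assuming A was created with local 
sizes m and n as above):

  Vec x, y;   /* y = A*x */
  VecCreateMPI(PETSC_COMM_WORLD, n, PETSC_DETERMINE, &x);  /* local size n matches the Mat's local columns */
  VecCreateMPI(PETSC_COMM_WORLD, m, PETSC_DETERMINE, &y);  /* local size m matches the Mat's local rows */
  MatMult(A, x, y);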


I am sorry to raise this simple question; I have read the manual and tutorials, 
but I have not found a clear answer. The reason I am asking is that I would 
like to use PETSc for matrix operations, but the elements of the matrices need 
to be calculated by my own code. If I know the distribution of the matrix, I 
can let each processor calculate and set only its local values (the rows and 
columns owned by that processor) for efficiency.
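
A minimal sketch of that pattern, where my_element() is a hypothetical 
stand-in for the user's own computation of entry (i,j):

  PetscInt    i, j, rstart, rend;
  PetscScalar v;
  MatGetOwnershipRange(A, &rstart, &rend);   /* rows owned by this process */
  for (i = rstart; i < rend; i++) {
    for (j = 0; j < N; j++) {
      v = my_element(i, j);                  /* hypothetical user routine */
      MatSetValues(A, 1, &i, 1, &j, &v, INSERT_VALUES);
    }
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);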

My second question is whether PETSc provides symmetric and anti-symmetric 
matrices. I have read the manual, and the answer seems to be no. Am I right?

See the SBAIJ format (it is sparse).

With a parallel dense matrix, there isn't any point in using a symmetric format 
unless you use a different distribution of the entries.
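
For reference, a minimal sketch of creating an SBAIJ matrix (the sizes and 
preallocation are hypothetical; in more recent releases the call is 
MatCreateSBAIJ):

  Mat S;
  /* Symmetric sparse matrix; only the upper triangular part is stored and set */
  MatCreateMPISBAIJ(PETSC_COMM_WORLD, 1,                /* block size 1 */
                    PETSC_DECIDE, PETSC_DECIDE, N, N,
                    5, PETSC_NULL, 5, PETSC_NULL,       /* rough preallocation guesses */
                    &S);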



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener