There are certain other matrix operations, such as solving with an 
LU/Cholesky-factored matrix and matrix multiplication, which are much faster 
with MAGMA. potrf, getrf, potrs, getrs and gemm all have the same calling 
sequences as their LAPACK counterparts, so I don't think we need a separate 
MAGMA matrix class for these. You are right, it doesn't make sense to make a 
MAGMA matrix class. I will try to put it in dense.c.
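
As a concrete illustration, here is a minimal sketch of what that hook in 
dense.c might look like (the function name MatLUFactor_SeqDense_MAGMA comes 
from the message below; the body, the pivot allocation, and the assumption 
that magma_int_t is compatible with PetscBLASInt are mine and untested):

    #include <magma.h>
    #include <../src/mat/impls/dense/seq/dense.h>

    /* Sketch: LU-factor a SeqDense matrix in place through MAGMA's CPU
       interface, which keeps A in host memory and allocates its GPU
       workspace internally (as noted below for magma_?getrf). */
    PetscErrorCode MatLUFactor_SeqDense_MAGMA(Mat A,IS row,IS col,const MatFactorInfo *info)
    {
      Mat_SeqDense   *mat = (Mat_SeqDense*)A->data;
      magma_int_t    m = A->rmap->n, n = A->cmap->n, minfo;
      PetscErrorCode ierr;

      PetscFunctionBegin;
      if (!mat->pivots) {
        ierr = PetscMalloc(m*sizeof(PetscBLASInt),&mat->pivots);CHKERRQ(ierr);
      }
      /* same calling sequence as LAPACK's dgetrf, but runs on the GPU */
      magma_dgetrf(m,n,mat->v,mat->lda,mat->pivots,&minfo);
      if (minfo > 0) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_MAT_LU_ZRPVT,"Bad LU factorization");
      A->factortype = MAT_FACTOR_LU;
      PetscFunctionReturn(0);
    }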
----- Original Message -----
From: "Barry Smith" <[email protected]>
To: "Harshad Sahasrabudhe" <[email protected]>
Cc: "For users of the development version of PETSc" <[email protected]>
Sent: Tuesday, August 20, 2013 7:10:40 PM
Subject: Re: [petsc-dev] Adding Seq Dense MAGMA Mat


   How general do you plan for this "magma" matrix class to be? If all it is 
for is to do LU/Cholesky factorizations, then you do NOT need to introduce 
src/mat/impls/dense/seq/magma at all. Simply use the MatGetFactor() mechanism; 
that is also the answer to your question "How will the user be able to access 
this function?". This is definitely the easy way to go.
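
   For illustration, a rough sketch of that user-facing path (the solver 
package name "magma" is a placeholder assumed here, and error checking is 
omitted for brevity):

    #include <petscmat.h>

    /* Factor A with a registered "magma" LU and solve A x = b. */
    static PetscErrorCode SolveWithMagmaLU(Mat A,Vec b,Vec x)
    {
      Mat           F;
      MatFactorInfo info;
      IS            rperm,cperm;

      PetscFunctionBegin;
      MatGetFactor(A,"magma",MAT_FACTOR_LU,&F);
      MatFactorInfoInitialize(&info);
      MatGetOrdering(A,MATORDERINGNATURAL,&rperm,&cperm);
      MatLUFactorSymbolic(F,A,rperm,cperm,&info);
      MatLUFactorNumeric(F,A,&info);
      MatSolve(F,b,x);
      ISDestroy(&rperm);ISDestroy(&cperm);
      MatDestroy(&F);
      PetscFunctionReturn(0);
    }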

   If you want to use the "magma" matrix class for all kinds of 
non-factorization matrix operations then you need to write a full magma matrix 
class. I don't recommend this.

   Barry

On Aug 20, 2013, at 1:35 PM, Harshad Sahasrabudhe <[email protected]> wrote:

> Hi,
> 
> I am working on adding the Seq Dense MAGMA matrix type. For this, I have 
> created the following files and directories:
> 
> src/mat/impls/dense/seq/magma
> src/mat/impls/dense/seq/magma/magma.c
> src/mat/impls/dense/seq/magma/makefile
> 
> Right now I am just trying to make LU factorization work through MAGMA. 
> Barry had suggested looking at dense.c, as the MAGMA function for LU has the 
> same calling sequence as the LAPACK function getrf. MAGMA does the memory 
> allocation on the GPU inside its function magma_?getrf. So can I directly 
> use the matrix type declared in dense.c, and just define a function 
> MatLUFactor_SeqDense_MAGMA in magma.c which uses the same matrix type?
> 
> How will the user be able to access this function?
> 
> Thanks,
> Harshad
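
For reference, the two calling sequences in question do line up; a sketch 
from the MAGMA 1.x CPU interface as I understand it (exact types may differ 
by version, and note MAGMA takes the scalar arguments by value rather than by 
pointer):

    /* LAPACK, as PETSc's dense.c calls it: */
    void dgetrf_(PetscBLASInt *m,PetscBLASInt *n,PetscScalar *a,
                 PetscBLASInt *lda,PetscBLASInt *ipiv,PetscBLASInt *info);

    /* MAGMA CPU interface: A stays in host memory; the GPU workspace is
       allocated inside the call, as the message above describes. */
    magma_int_t magma_dgetrf(magma_int_t m,magma_int_t n,double *A,
                             magma_int_t lda,magma_int_t *ipiv,
                             magma_int_t *info);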
