Fande,

    I think you should just use AIJ. When the matrix is diagonal, all the algorithms (MatMult, MatFactor, MatSolve) are order n work with a relatively small constant, and the overhead of using AIJ instead of a custom format is probably at most a factor of three. Since the work is order n with a small constant, any gain from a custom format would be lost in the much bigger constants for the rest of the computation.
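To make that concrete, here is a minimal sketch (not from this thread) of holding the lumped diagonal in an ordinary AIJ matrix with one nonzero per local row, then applying it and its inverse; the input Vec "lumped" carrying the lumped masses and the function name are hypothetical:

/* Sketch only: store the lumped (diagonal) mass matrix as plain AIJ,
   one nonzero per local row, then apply it and its inverse.
   The input Vec "lumped" holding the lumped masses is hypothetical. */
#include <petscmat.h>

PetscErrorCode ApplyLumpedMass(Vec lumped, Vec x, Vec Mx, Vec Minvx)
{
  Mat                M;
  PetscInt           rstart, rend, i;
  const PetscScalar *m;
  PetscErrorCode     ierr;

  PetscFunctionBeginUser;
  ierr = VecGetOwnershipRange(lumped, &rstart, &rend);CHKERRQ(ierr);
  /* 1 nonzero per row in the diagonal block, none off-process */
  ierr = MatCreateAIJ(PetscObjectComm((PetscObject)lumped), rend - rstart, rend - rstart,
                      PETSC_DETERMINE, PETSC_DETERMINE, 1, NULL, 0, NULL, &M);CHKERRQ(ierr);
  ierr = VecGetArrayRead(lumped, &m);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    ierr = MatSetValue(M, i, i, m[i - rstart], INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = VecRestoreArrayRead(lumped, &m);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(M, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(M, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatMult(M, x, Mx);CHKERRQ(ierr);                    /* M x: one multiply per row */
  ierr = VecPointwiseDivide(Minvx, x, lumped);CHKERRQ(ierr); /* M^{-1} x: pointwise divide */

  ierr = MatDestroy(&M);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

Both the MatMult and the inverse application touch each entry once, which is the order n work with a small constant mentioned above.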

   Barry

I know Rich doesn't have unlimited money, and I suspect spending it on almost 
anything else (like improving the load balancing in libMesh) will pay off far, 
far more.


> On Feb 14, 2018, at 8:29 PM, Jed Brown <j...@jedbrown.org> wrote:
> 
> Fande Kong <fdkong...@gmail.com> writes:
> 
>> On Wed, Feb 14, 2018 at 4:35 PM, Smith, Barry F. <bsm...@mcs.anl.gov> wrote:
>> 
>>> 
>>>  What are you doing with the matrix?
>>> 
>> 
>> We are doing an explicit method. The PDEs are discretized using a finite
>> element method, so there is a mass matrix. The mass matrix will be lumped,
>> so it becomes diagonal. We want to compute the inverse of the lumped
>> matrix, and also do a few matrix-vector multiplications using the
>> lumped matrix or its inverse.
>> 
>> Wouldn't a specific implementation make this more efficient?
> 
> You can use pretty much any representation and you won't notice the time
> because you still have to apply the RHS operator and that is vastly more
> expensive.
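
For the lumping step Fande describes in the quoted message, a row-sum based sketch could look like the following; "Mass", "LumpMass", and the output Vec names are hypothetical, and MatGetRowSum just adds up each row of the assembled consistent mass matrix:

/* Sketch only: row-sum lumping of an assembled consistent mass matrix.
   "Mass" and the output Vec names are hypothetical. */
#include <petscmat.h>

PetscErrorCode LumpMass(Mat Mass, Vec *lumped, Vec *lumped_inv)
{
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  ierr = MatCreateVecs(Mass, lumped, NULL);CHKERRQ(ierr);
  ierr = MatGetRowSum(Mass, *lumped);CHKERRQ(ierr);    /* m_i = sum_j M_ij */
  ierr = VecDuplicate(*lumped, lumped_inv);CHKERRQ(ierr);
  ierr = VecCopy(*lumped, *lumped_inv);CHKERRQ(ierr);
  ierr = VecReciprocal(*lumped_inv);CHKERRQ(ierr);     /* 1/m_i */
  PetscFunctionReturn(0);
}

Applying the lumped inverse to a vector r is then just VecPointwiseMult(u, *lumped_inv, r), which, as Jed says, is lost in the noise next to evaluating the RHS operator.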
