> On Oct 12, 2020, at 6:10 AM, Olivier Jamond <[email protected]> wrote:
> 
> Hi Barry,
> 
> Thanks for this work! I tried this branch with my code and sequential 
> matrices on a small case: it does work!
> 
> 
  Excellent. I will extend it to the parallel case and get it into our master 
release.

  We'd be interested in hearing about your convergence and timing experience 
when you run largish jobs (even sequentially), since this type of problem comes 
up relatively frequently and we need a variety of solvers that can handle it; 
currently we do not have great tools for it.

   Barry

> Thanks a lot,
> Olivier
> 
> On 09/10/2020 03:50, Barry Smith wrote:
>> 
>>   Olivier,
>> 
>>     The branch barry/2020-10-08/invert-block-diagonal-aij contains an 
>> example src/mat/tests/ex178.c that shows how to compute inv(CC'). It works 
>> for SeqAIJ matrices.
>> 
>>     Please let us know if it works for you and then I will implement the 
>> parallel version.
>> 
>>   Barry
>> 
>> 
>>> On Oct 8, 2020, at 3:59 PM, Barry Smith <[email protected]> wrote:
>>> 
>>> 
>>>  Olivier,
>>> 
>>>  I am working on extending the routines now and hope to push a branch you 
>>> can try fairly soon.
>>> 
>>>  Barry
>>> 
>>> 
>>>> On Oct 8, 2020, at 3:07 PM, Jed Brown <[email protected]> wrote:
>>>> 
>>>> Olivier Jamond <[email protected]> writes:
>>>> 
>>>>>>   Given the structure of C it seems you should just explicitly construct 
>>>>>> Sp and use GAMG (or other preconditioners, even a direct solver) directly 
>>>>>> on Sp. Trying to avoid explicitly forming Sp will give you a much slower 
>>>>>> performing solve, and for what benefit? If C were just some generic 
>>>>>> monster, then forming Sp might be unrealistic, but in your case C*Ct is 
>>>>>> block diagonal with tiny blocks, which means (C*Ct)^(-1) is block diagonal 
>>>>>> with tiny blocks (the blocks are the inverses of the blocks of C*Ct).
>>>>>> 
>>>>>>    Sp = Ct*C  + Qt * S * Q = Ct*C  +  [I - Ct * (C*Ct)^(-1)*C] S [I - Ct 
>>>>>> * (C*Ct)^(-1)*C]
>>>>>> 
>>>>>> [Ct * (C*Ct)^(-1)*C] will again be block diagonal with slightly larger 
>>>>>> blocks.
>>>>>> 
>>>>>> You can form D = C*Ct with MatMatMult(), then write custom code that zips 
>>>>>> through the diagonal blocks of D, inverting each of them, to get iD; then 
>>>>>> use MatPtAP() with A = iD and P = C to get Ct * (C*Ct)^(-1)*C; then 
>>>>>> MatShift() to include the I; then MatPtAP() or MatRARt() to get [I - Ct * 
>>>>>> (C*Ct)^(-1)*C] S [I - Ct * (C*Ct)^(-1)*C]; and finally MatAXPY() to get 
>>>>>> Sp. The complexity of each of the Mat operations is very low because of 
>>>>>> the absurdly simple structure of C and its descendants. You might even 
>>>>>> be able to just use MUMPS to give you the explicit inv(C*Ct) without 
>>>>>> writing custom code to get iD.
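>>>>>> 
>>>>>> Concretely, that sequence might look like the following untested sketch 
>>>>>> (BuildSp is just a placeholder name; it assumes C, Ct, S, and 
>>>>>> iD = inv(C*Ct) already exist as assembled Mats):
>>>>>> 
>>>>>> #include <petscmat.h>
>>>>>> 
>>>>>> /* Sp = Ct*C + [I - Ct*iD*C] S [I - Ct*iD*C], with iD = inv(C*Ct) built elsewhere */
>>>>>> static PetscErrorCode BuildSp(Mat C, Mat Ct, Mat S, Mat iD, Mat *Sp)
>>>>>> {
>>>>>>   Mat            Q, QtSQ;
>>>>>>   PetscErrorCode ierr;
>>>>>> 
>>>>>>   PetscFunctionBeginUser;
>>>>>>   /* Q = Ct * iD * C, computed as P'*A*P with A = iD and P = C */
>>>>>>   ierr = MatPtAP(iD, C, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &Q);CHKERRQ(ierr);
>>>>>>   /* Q = I - Ct*iD*C */
>>>>>>   ierr = MatScale(Q, -1.0);CHKERRQ(ierr);
>>>>>>   ierr = MatShift(Q, 1.0);CHKERRQ(ierr);
>>>>>>   /* QtSQ = Q' * S * Q */
>>>>>>   ierr = MatPtAP(S, Q, MAT_INITIAL_MATRIX, PETSC_DEFAULT, &QtSQ);CHKERRQ(ierr);
>>>>>>   /* Sp = Ct*C + Q'*S*Q */
>>>>>>   ierr = MatMatMult(Ct, C, MAT_INITIAL_MATRIX, PETSC_DEFAULT, Sp);CHKERRQ(ierr);
>>>>>>   ierr = MatAXPY(*Sp, 1.0, QtSQ, DIFFERENT_NONZERO_PATTERN);CHKERRQ(ierr);
>>>>>>   ierr = MatDestroy(&Q);CHKERRQ(ierr);
>>>>>>   ierr = MatDestroy(&QtSQ);CHKERRQ(ierr);
>>>>>>   PetscFunctionReturn(0);
>>>>>> }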
>>>>> 
>>>>> So far I haven't managed to compute iD = inv(C*Ct) without using dense 
>>>>> matrices, which may be a shame because all the matrices are sparse. Is it 
>>>>> possible?
>>>>> 
>>>>> And I have no idea how to write code that manually zips through the 
>>>>> diagonal blocks of D to invert them...
>>>> 
>>>> You could use MatInvertVariableBlockDiagonal(), which should perhaps 
>>>> return a Mat instead of a raw array.
>>>> 
>>>> If you have constant block sizes, MatInvertBlockDiagonalMat will return a 
>>>> Mat.
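>>>> 
>>>> For the constant-block-size case, a minimal untested sketch (the helper 
>>>> name InvertDiagonalBlocks is just a placeholder, and the exact requirements 
>>>> on how the output Mat must be created should be checked against the 
>>>> MatInvertBlockDiagonalMat() man page):
>>>> 
>>>> #include <petscmat.h>
>>>> 
>>>> /* Wrap the inverted block diagonal of D = C*Ct in a new Mat.
>>>>    Assumes a constant block size has already been set on D (MatSetBlockSize). */
>>>> static PetscErrorCode InvertDiagonalBlocks(Mat D, Mat *iD)
>>>> {
>>>>   PetscErrorCode ierr;
>>>> 
>>>>   PetscFunctionBeginUser;
>>>>   ierr = MatCreate(PetscObjectComm((PetscObject)D), iD);CHKERRQ(ierr);
>>>>   ierr = MatSetType(*iD, MATAIJ);CHKERRQ(ierr);
>>>>   ierr = MatInvertBlockDiagonalMat(D, *iD);CHKERRQ(ierr);
>>>>   /* For variable block sizes, MatInvertVariableBlockDiagonal() instead
>>>>      fills a caller-allocated PetscScalar array with the inverted blocks. */
>>>>   PetscFunctionReturn(0);
>>>> }
>>>> 
>>>> iD can then be passed to MatPtAP(iD, C, ...) as in the BuildSp sketch above.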
>>> 
>> 
