Thanks, Barry. 

Your message provides a good starting point. I will need to do more reading to 
understand my options. 
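
After a first look at the PCFIELDSPLIT documentation, I am guessing a starting 
configuration might look something like the options below. This is only a 
sketch of my current understanding, not something I have tested: I am assuming 
the five Euler unknowns are interlaced per node (hence a block size of 5), and 
the choice of GAMG on each split is a guess I still need to check against the 
literature.

  -ksp_type fgmres -ksp_monitor_true_residual
  -pc_type fieldsplit -pc_fieldsplit_block_size 5
  -pc_fieldsplit_type multiplicative
  -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_type gamg
  (and similarly -fieldsplit_1_ through -fieldsplit_4_ for the other splits)

I picked fgmres for the outer solve so that the setup would still be valid if 
I later replace the preonly inner solves with inner Krylov iterations.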

Thanks,
Manav



> On Apr 7, 2015, at 3:00 PM, Barry Smith <[email protected]> wrote:
> 
> 
>   You cannot just use algebraic multigrid directly on the transonic Euler 
> flow problem discretized with SUPG; you must break the problem into pieces 
> (probably using PCFIELDSPLIT) and then use appropriate preconditioners for 
> each piece. 
> 
>   Barry
> 
> Note that ILU is a bottom-feeder preconditioner; it is only working because 
> you are using a huge fill factor (so it is a lot like a direct solver), and 
> even then it is working really poorly. You should google for good 
> preconditioners for transonic Euler flow and SUPG (and ignore anything that 
> mentions ILU) to get a handle on how PCFIELDSPLIT could be used for your 
> problem.
> 
> 
>> On Apr 7, 2015, at 7:06 AM, Manav Bhatia <[email protected]> wrote:
>> 
>> Hi,
>> 
>>   I am solving a transonic Euler flow problem discretized with SUPG. The 
>> mesh is made of Tet4 elements, with about 7M dofs, which I am trying to 
>> solve on 192 cores. 
>> 
>>   I had earlier written about the linear solver returning Inf; I have since 
>> moved past that and am now able to get a solution with the following 
>> command-line parameters:
>> 
>> -ksp_gmres_restart 100  -pc_type asm -sub_pc_type ilu -sub_pc_factor_levels 
>> 4 -sub_ksp_type preonly
>> 
>>   The problem now is that I am limited to very small time steps (~10e-6); 
>> beyond that the solver starts to choke and the solution diverges. I have 
>> tried other command-line options, and the combination above seems to 
>> provide the best solution strategy, albeit with this time-step constraint. 
>> 
>>    I would appreciate any recommendations on solver configurations that may 
>> allow me to take larger time-steps. 
>> 
>>    Would coding up AMG be a possible alternative (more robust and 
>> dependable)? 
>> 
>>    Please let me know if you need additional information. 
>> 
>> Regards,
>> Manav
>> 
> 
