$PETSC_DIR/$PETSC_ARCH/lib/libpetsc*
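
For example, on Linux (a sketch; the exact shared-library name depends on your build, and on a Mac you would use otool -L instead of ldd):

```shell
# List the shared libraries PETSc links against and look for hypre.
# Paths assume the standard PETSc layout; adjust to your build.
ldd $PETSC_DIR/$PETSC_ARCH/lib/libpetsc.so | grep -i hypre
```

If the hypre line points at your OpenMP-enabled build, you are linking the right library.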

> On May 26, 2015, at 6:52 PM, frank <[email protected]> wrote:
> 
> Hi Barry,
> 
> Thank you for your prompt reply.
> Which library should I check with ldd?
> 
> Thank you,
> Frank.
> 
> On 05/26/2015 02:41 PM, Barry Smith wrote:
>>> On May 26, 2015, at 4:18 PM, frank <[email protected]> wrote:
>>> 
>>> Hi
>>> 
>>> I am trying to use multigrid to solve a large sparse linear system. I use 
>>> Hypre BoomerAMG as the preconditioner. The code calling KSPSolve is 
>>> parallelized with MPI.
>>> 
>>> I want to set Hypre to use OpenMP. Here is what I did:
>>> * I downloaded and compiled Hypre through PETSc.
>>> * I recompiled Hypre with " --with-openmp ".
>>    Ok, you need to make sure that PETSc is linking against the OpenMP 
>> compiled version of hypre libraries. Use ldd on linux or otool -L on Mac.
>> 
>>> * I set "-pc_type hypre" and "-pc_hypre_type boomeramg" for PETSc.
>>> 
>>> My question:
>>> * In this way, will Hypre use OpenMP to parallelize the execution when 
>>> KSPSolve is called?
>>> * If this does not work, is there another way I can set Hypre to use OpenMP 
>>> under PETSc?
>>> * Is there a way I can know explicitly whether Hypre is using OpenMP under 
>>> PETSc or not?
>>    Your question really has little to do with PETSc and more to do with 
>> hypre. You need to look through the hypre documentation and find out how you 
>> control the number of OpenMP threads that hypre uses (likely it is some 
>> environment variable). Then run varying the number of threads and see 
>> what happens: if you use more threads, does it go faster? It is best to make 
>> this test with a single MPI process and 1, 2, 4, 8 OpenMP threads.
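
For example, assuming hypre honors the standard OMP_NUM_THREADS environment variable (the executable name ./app and its options are placeholders for whatever program calls KSPSolve):

```shell
# Time the solve with one MPI rank while increasing the OpenMP thread count.
# ./app is a hypothetical application that calls KSPSolve.
for t in 1 2 4 8; do
  echo "== OMP_NUM_THREADS=$t =="
  OMP_NUM_THREADS=$t mpiexec -n 1 ./app \
    -pc_type hypre -pc_hypre_type boomeramg -log_view
done
```

Compare the PCApply and KSPSolve times in the -log_view output across runs; if they do not change with the thread count, hypre is not using OpenMP.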
>> 
>>   Barry
>> 
>>> Thank you so much
>>> Frank
>>> 
> 
