On Tue, Oct 10, 2017 at 10:30 PM, Evan Um <[email protected]> wrote:

> Dear Hong,
>
> I just tried to check the PETSc developer website but couldn't find the
> updated dev version with the name hzhang/update-mumps-5.1.1-cntl.
Hong means a branch. Get the dev repository and check out that branch:

    git checkout hzhang/update-mumps-5.1.1-cntl

Thanks,

    Matt

> Could you please let me know the location of the updated dev version?
> Where do I need to visit to check out the dev version with the new control
> switches? Thank you very much for your help.
>
> Best,
> Evan
>
> On Tue, Oct 3, 2017 at 8:34 AM, Hong <[email protected]> wrote:
>
>> Evan,
>> ICNTL(35) and CNTL(7) are added to the petsc-mumps interface in branch
>> hzhang/update-mumps-5.1.1-cntl
>>
>> You may give it a try. Once it passes our regression tests, I'll merge it
>> into the petsc master branch.
>>
>> Hong
>>
>> On Sun, Sep 24, 2017 at 8:08 PM, Hong <[email protected]> wrote:
>>
>>> I'll check it.
>>> Hong
>>>
>>> On Sun, Sep 24, 2017 at 3:42 PM, Evan Um <[email protected]> wrote:
>>>
>>>> Hi Barry,
>>>>
>>>> Thanks for your comments. To activate block low-rank (BLR)
>>>> approximation in MUMPS version 5.1.1, a user needs to turn on the
>>>> functionality (i.e. ICNTL(35)=1) and specify the tolerance value
>>>> (e.g. CNTL(7)=1e-4). In PETSc, I think that we can set the ICNTL and
>>>> CNTL parameters for MUMPS. I was wondering if we can still use the
>>>> BLR approximation as a preconditioner for Krylov solvers.
>>>>
>>>> Best,
>>>> Evan
>>>>
>>>> On Sat, Sep 23, 2017 at 6:45 PM, Barry Smith <[email protected]>
>>>> wrote:
>>>>
>>>>> > On Sep 23, 2017, at 8:38 PM, Evan Um <[email protected]> wrote:
>>>>> >
>>>>> > Dear PETSc Users,
>>>>> >
>>>>> > My system matrix comes from finite-element modeling and is complex
>>>>> > and unstructured. Its typical size is a few million by a few
>>>>> > million. I am wondering if I can use the MUMPS parallel direct
>>>>> > solver as a preconditioner in PETSc. For example, I want to pass
>>>>> > factored matrices to Krylov iterative solvers such as QMR. Is
>>>>> > there any PETSc+MUMPS example code for this purpose?
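For anyone trying the branch from the command line: a minimal sketch of how the new switches would be exercised at runtime, assuming a PETSc application binary (here the placeholder name `./app`) built against the branch above; the Krylov method is an arbitrary choice for illustration.

```shell
# Run with a full MUMPS LU factorization as the preconditioner, BLR enabled.
#   -mat_mumps_icntl_35 1  -> ICNTL(35)=1, turn on block low-rank factorization
#   -mat_mumps_cntl_7 1e-4 -> CNTL(7)=1e-4, the BLR dropping tolerance
./app -ksp_type gmres \
      -pc_type lu -pc_factor_mat_solver_package mumps \
      -mat_mumps_icntl_35 1 \
      -mat_mumps_cntl_7 1e-4
```

Because the LU factors are only used inside the PC, an inexact (BLR) factorization can still yield a convergent Krylov iteration.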
>>>>>
>>>>> You don't pass factored matrices; you just pass the original matrix
>>>>> and use -pc_type lu -pc_factor_mat_solver_package mumps
>>>>>
>>>>> > Can PETSc call the latest MUMPS that supports block low-rank
>>>>> > approximation?
>>>>>
>>>>> No, send us info on it and we'll see if we can add an interface
>>>>>
>>>>> > In advance, thank you very much for your comments.
>>>>> >
>>>>> > Best,
>>>>> > Evan

--
What most experimenters take for granted before they begin their
experiments is infinitely more interesting than any results to which
their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/ <http://www.caam.rice.edu/~mk51/>
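The same controls can also be set programmatically through the PETSc/MUMPS interface. A hedged, untested sketch (assumes PETSc built with MUMPS and the branch above; `A`, `b`, `x` stand for your already-assembled matrix and vectors, and error checking is omitted for brevity):

```c
#include <petscksp.h>

/* Sketch: Krylov solve with full MUMPS LU as the preconditioner,
   block low-rank (BLR) approximation enabled on the factorization. */
KSP ksp;
PC  pc;
Mat F;  /* the MUMPS factor matrix */

KSPCreate(PETSC_COMM_WORLD, &ksp);
KSPSetOperators(ksp, A, A);
KSPSetType(ksp, KSPGMRES);              /* any Krylov method works here */
KSPGetPC(ksp, &pc);
PCSetType(pc, PCLU);
PCFactorSetMatSolverPackage(pc, MATSOLVERMUMPS);
PCFactorSetUpMatSolverPackage(pc);      /* create F before setting options */
PCFactorGetMatrix(pc, &F);
MatMumpsSetIcntl(F, 35, 1);             /* ICNTL(35)=1: enable BLR */
MatMumpsSetCntl(F, 7, 1e-4);            /* CNTL(7)=1e-4: BLR tolerance */
KSPSolve(ksp, b, x);
```

The key step is `PCFactorSetUpMatSolverPackage()` followed by `PCFactorGetMatrix()`, which exposes the MUMPS factor matrix so `MatMumpsSetIcntl()`/`MatMumpsSetCntl()` can be called on it before the factorization runs.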
