> On Oct 21, 2019, at 8:55 PM, Matthew Knepley via petsc-users 
> <[email protected]> wrote:
> 
> On Mon, Oct 21, 2019 at 9:47 PM Zhang, Hong <[email protected]> wrote:
> Matt:
> On Mon, Oct 21, 2019 at 9:30 PM Zhang, Hong via petsc-users 
> <[email protected]> wrote:
> Shash,
> You may do it without using a PC. See petsc/src/mat/examples/tests/ex125.c
> 
> Does this example let you invert the matrix and just apply it to get the solve?
> No. It stores the LU factors in F and then calls MatSolve(F, b, x) multiple 
> times to get the solution x for a given b.
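> 
> A minimal sketch of that sequence (untested as written; the ordering type and 
> solver package below are illustrative choices, ex125.c is the authoritative 
> reference):
> 
> Mat           A, F;   // A: assembled SeqAIJ matrix, F: its LU factors
> Vec           b, x;
> IS            isrow, iscol;
> MatFactorInfo info;
> 
> MatFactorInfoInitialize(&info);
> MatGetOrdering(A, MATORDERINGND, &isrow, &iscol);
> MatGetFactor(A, MATSOLVERPETSC, MAT_FACTOR_LU, &F);  // or MATSOLVERSUPERLU
> MatLUFactorSymbolic(F, A, isrow, iscol, &info);
> MatLUFactorNumeric(F, A, &info);
> MatSolve(F, b, x);    // repeat for as many right-hand sides as needed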
> 
> Do we have anything in PETSc now that gets the inverse? Storing and applying 
> the inverse can be much faster
> than using the factors (it's done in supernodal methods, and in our patch 
> smoothers). If not, we should talk about
> adding it.

   This is done in pbjacobi. 

   There was some discussion in the most recent patch MR about providing a 
common API for "small" block factorization/solve that supports batching and 
other modern features. This would be a distinct construct from KSP and PC; I 
don't think it even has a name yet. When it is done, it should also be done for 
GPUs.
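
   One way to get an explicit inverse from the factors today (a sketch only, 
assuming a small sequential problem where forming a dense inverse is 
acceptable) is to solve against a dense identity with MatMatSolve():

     Mat      F, Id, Ainv;  /* F from MatGetFactor()/MatLUFactorNumeric() */
     PetscInt n;            /* size of the (small, sequential) matrix */

     MatCreateSeqDense(PETSC_COMM_SELF, n, n, NULL, &Id);
     MatZeroEntries(Id);
     MatShift(Id, 1.0);                /* Id is now the n x n identity */
     MatCreateSeqDense(PETSC_COMM_SELF, n, n, NULL, &Ainv);
     MatMatSolve(F, Id, Ainv);         /* columns of Ainv are A^{-1} e_j */

   The inverse can then be applied with MatMult() in place of repeated 
MatSolve() calls.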

  Barry

> 
>   Thanks,
> 
>      Matt
>  
> Hong
> 
> On Mon, Oct 21, 2019 at 4:43 PM Shashwat Sharma via petsc-users 
> <[email protected]> wrote:
> Hello,
> 
> For some small matrices, I'd like to use Petsc to perform direct LU 
> factorization on a sequential dense or AIJ matrix, and then use the factored 
> matrix later on via MatSolve. This occurs multiple times in my code, and each 
> factored matrix is in turn used multiple times.
> 
> I tried to wrap this factorization process in a function, which should return 
> the factored matrix, as follows:
> 
> void MatFactorize_Petsc(Mat &mat, Mat &mat_factored)
> {
>     PC pc;
>     PCCreate(MPI_COMM_SELF, &pc);
>     PCSetOperators(pc, mat, mat);
>     PCSetType(pc, PCLU);
> 
>     PCFactorSetMatSolverType(pc, MATSOLVERPETSC); // or MATSOLVERSUPERLU
>     PCFactorSetUpMatSolverType(pc);               // creates the factor matrix
> 
>     PCFactorGetMatrix(pc, &mat_factored);         // borrows a pointer, no reference taken
>     PCSetUp(pc);                                  // performs the factorization
>     // PCDestroy(&pc);                            // enabling this leads to the segfault described below
>     return;
> }
> 
> Calling PCDestroy causes a segmentation fault, which I think happens because 
> retrieving the factored matrix does not increase its reference count. Looking 
> at the PETSc source, PCFactorGetMatrix basically returns a pointer into 
> pc->data. So if I want to use mat_factored outside the function, I cannot 
> destroy the PC object, which leads to memory leaks (as reported by Valgrind) 
> even if I later call MatDestroy(&mat_factored).
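> 
> (One workaround I have considered, but not verified, is to take an extra 
> reference on the factored matrix before destroying the PC, e.g.
> 
> PetscObjectReference((PetscObject)mat_factored); // keep the factors alive
> PCDestroy(&pc);
> // ... later, use mat_factored with MatSolve(), then MatDestroy(&mat_factored);
> 
> but I am not sure whether this is the intended usage.)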
> 
> I tried using a temp matrix to get the factors, and then doing 
> MatDuplicate(temp, MAT_COPY_VALUES, &mat_factored), but MatDuplicate is not 
> allowed on factored matrices; same for MatConvert.
> 
> Is there a way I can achieve the desired behaviour, where mat_factored is not 
> "linked" to the PC or KSP object, keeping in mind that I'd like to be able to 
> choose between the PETSc, SuperLU, and SuperLU_DIST solvers?
> 
> Thanks,
> Shash
> 
> 
> 
> -- 
> What most experimenters take for granted before they begin their experiments 
> is infinitely more interesting than any results to which their experiments 
> lead.
> -- Norbert Wiener
> 
> https://www.cse.buffalo.edu/~knepley/
> 
