Please "reply all" and include the list in the future.
"Ferrand, Jesus A." writes:
> Forgot to say thanks for the reply (my bad).
> Yes, I was indeed forgetting to pre-allocate the sparse matrices when
> assembling them by hand (complacency seeded by MATLAB's zeros()). Thank
> you, Jed and Jeremy.
Hello Stefano, and thank you for your support.
I have never used SLEPc before, but I did the following: right after loading
the matrix from file, I added these lines to my shared tiny code:
MatLoad(matrix, v);
EPS eps;
EPSCreate(PETSC_COMM_WORLD, &eps);
EPSSetOperators(eps, matrix, NULL);
On Tue, Jan 4, 2022, 16:56 Marco Cisternino wrote:
> Hello Mark,
>
> I analyzed the codes with valgrind, both the real code and the tiny one.
>
> I used the memcheck tool with full leak checking, compiling the codes
> with debug info.
>
> Not considering OpenMPI events (I have no
Hi,
We’ve got some users of our GridPACK package who are trying to build on the
Cori machine at NERSC. GridPACK uses CMake for its build system and relies on
Jed Brown’s FindPETSc.cmake module, along with the FindPackageMultipass.cmake
module, to identify PETSc. The tests for PETSc are
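For readers unfamiliar with that module: a typical invocation pattern looks like the sketch below. The paths and target names are placeholders, and the variable names (`PETSC_INCLUDES`, `PETSC_LIBRARIES`, `PETSC_DEFINITIONS`) are those historically set by Jed Brown's FindPETSc.cmake; the module locates PETSc via the `PETSC_DIR`/`PETSC_ARCH` variables and validates it with small compile/run tests driven by FindPackageMultipass.cmake.

```cmake
# Sketch: assumes FindPETSc.cmake and FindPackageMultipass.cmake live in cmake/
list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake")

# Reads PETSC_DIR and PETSC_ARCH (CMake or environment variables) and
# runs small test programs to verify the installation works.
find_package(PETSc REQUIRED)

include_directories(${PETSC_INCLUDES})
add_definitions(${PETSC_DEFINITIONS})
add_executable(app main.c)
target_link_libraries(app ${PETSC_LIBRARIES})
```

On Cray systems like Cori, those embedded compile/run tests are a common failure point, since compute-node binaries built by the compiler wrappers may not run on the login node.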
Hello Mark,
I analyzed the codes with valgrind, both the real code and the tiny one.
I used the memcheck tool with full leak checking, compiling the codes
with debug info.
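A memcheck run of that sort (command sketch; `./real_code` is a placeholder binary name, built with `-g`) would be:

```sh
# Full leak check on a debug build; one rank shown for brevity.
valgrind --tool=memcheck --leak-check=full --show-leak-kinds=all ./real_code

# Under Open MPI, its shipped suppression file quiets known MPI-internal noise:
mpirun -np 2 valgrind --leak-check=full \
  --suppressions=$OPENMPI_PREFIX/share/openmpi/openmpi-valgrind.supp ./real_code
```

The suppression file only helps with Open MPI's own allocations; leaks in user code still show up normally.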
Not considering OpenMPI events (I have no wrappers on the machine I used for
the analysis), the real code gave zero