> On 11 Oct 2023, at 9:13 AM, Thanasis Boutsikakis wrote:
>
> Very good catch Pierre, thanks a lot!
>
> This made everything work: the two-step process and the ptap(). I mistakenly
> thought that I should not let the local number of columns be None, since the
> matrix is only partitioned row-wise. Could you please explain what happened
> because of my setting?
That’s because:
size = ((None, global_rows), (global_cols, global_cols))
should be:
size = ((None, global_rows), (None, global_cols))
Then, it will work.
$ ~/repo/petsc/arch-darwin-c-debug-real/bin/mpirun -n 4 python3.12 test.py && echo $?
0
Thanks,
Pierre
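For readers following along: a petsc4py size tuple has the shape ((local_rows, global_rows), (local_cols, global_cols)), and passing None for a local dimension (PETSC_DECIDE) lets PETSc choose the split itself. A minimal sketch of that split, assuming the usual PETSc ownership formula (the function name here is my own, not petsc4py API):

```python
def petsc_decide_local_size(n_global, comm_size, rank):
    """Local size when a dimension is left as PETSC_DECIDE (None in petsc4py):
    the first n_global % comm_size ranks get one extra row/column.
    Hypothetical helper mirroring PETSc's ownership split, not petsc4py API."""
    base, extra = divmod(n_global, comm_size)
    return base + (1 if rank < extra else 0)

# Example: 10 global columns split across 4 ranks
sizes = [petsc_decide_local_size(10, 4, r) for r in range(4)]
print(sizes)  # -> [3, 3, 2, 2]
assert sum(sizes) == 10
```

Hard-coding (global_cols, global_cols) instead tells every rank it owns all the columns, so the global column count implied across ranks no longer matches.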
> On 11 Oct 2023, at 8:58
Furthermore, I tried to perform the Galerkin projection in two steps by
substituting
A_prime = A.ptap(Phi)
with
AL = Phi.transposeMatMult(A)
A_prime = AL.matMult(Phi)
Running this with 3 procs results in the false creation of a matrix AL
that has 3 times bigger dimensions than it
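In exact arithmetic the one-step and two-step products coincide; a quick dense NumPy check (a plain-numpy stand-in for illustration, not petsc4py) of the Galerkin projection Phi^T A Phi computed both ways:

```python
import numpy as np

rng = np.random.default_rng(0)
m, k = 9, 3                      # A is m x m, Phi is m x k (tall and skinny)
A = rng.standard_normal((m, m))
Phi = rng.standard_normal((m, k))

# One-step: what A.ptap(Phi) computes, Phi^T A Phi
A_prime_ptap = Phi.T @ A @ Phi

# Two-step, as in the email: AL = Phi^T A, then A_prime = AL Phi
AL = Phi.T @ A                   # k x m intermediate
A_prime_two_step = AL @ Phi      # k x k result

assert A_prime_ptap.shape == (k, k)
assert np.allclose(A_prime_ptap, A_prime_two_step)
```

So the inflated dimensions of AL in the parallel run point at the size/layout arguments, not at the algebra.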
Pierre, I see your point, but my experiment shows that it does not even run due
to a size mismatch, so I don’t see how being sparse would change things here.
There must be some kind of problem with the parallel ptap(), because it does
run sequentially. In order to test that, I changed the flags
I disagree with what Mark and Matt are saying: your code is fine, the error
message is fine, petsc4py is fine (in this instance).
It’s not a typical use case of MatPtAP(), which is mostly designed for MatAIJ,
not MatDense.
On the one hand, in the MatDense case, indeed there will be a mismatch
This looks like a false positive or there is some subtle bug here that we
are not seeing.
Could this be the first time parallel PtAP has been used (and reported) in
petsc4py?
Mark
On Tue, Oct 10, 2023 at 8:27 PM Matthew Knepley wrote:
> On Tue, Oct 10, 2023 at 5:34 PM Thanasis Boutsikakis <
> thanasis.boutsika...@corintis.com> wrote:
>
> Hi all,
>
> Revisiting my code and the proposed solution from Pierre, I realized this
> works only in sequential. The reason is that PETSc partitions those matrices
> only row-wise, which leads to an error due to the mismatch between the number
> of columns of A (non-partitioned) and the number of rows of
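The row-wise bookkeeping can be illustrated with 3 "ranks" in plain NumPy (a hedged stand-in, not petsc4py): each rank owns an (m/p) x m strip of A and an (m/p) x k strip of Phi, and the global Phi^T A is the sum of the per-rank partial products, which is why the local layouts have to line up:

```python
import numpy as np

rng = np.random.default_rng(1)
m, k, p = 9, 3, 3                # global size, projection size, mock rank count
A = rng.standard_normal((m, m))
Phi = rng.standard_normal((m, k))

# Row-wise partition: rank r owns rows r*(m//p) .. (r+1)*(m//p) of A and Phi
rows = m // p
partials = []
for r in range(p):
    A_loc = A[r * rows:(r + 1) * rows, :]      # shape (3, 9)
    Phi_loc = Phi[r * rows:(r + 1) * rows, :]  # shape (3, 3)
    partials.append(Phi_loc.T @ A_loc)         # each partial is k x m

# Summing the per-rank partials reproduces the global Phi^T A
assert np.allclose(sum(partials), Phi.T @ A)
```

The per-rank pieces are conformable only because A and Phi share the same row distribution; no column partitioning of A is needed.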
This works Pierre. Amazing input, thanks a lot!
> On 5 Oct 2023, at 14:17, Pierre Jolivet wrote:
>
> Not a petsc4py expert here, but you may want to try instead:
> A_prime = A.ptap(Phi)
>
> Thanks,
> Pierre
> On 5 Oct 2023, at 2:02 PM, Thanasis Boutsikakis wrote:
>
> Thanks Pierre! So I tried this and got a segmentation fault. Is this supposed
> to work right off the bat or am I missing something?
[0]PETSC ERROR:
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably
How about using ptap which will use MatPtAP?
It will be more efficient (and it will help you bypass the issue).
Thanks,
Pierre
> On 5 Oct 2023, at 1:18 PM, Thanasis Boutsikakis wrote:
>
> Sorry, forgot function create_petsc_matrix()
>
> def create_petsc_matrix(input_array, sparse=True):
>     """Create a PETSc matrix from an input_array
>
>     Args:
>         input_array (np array): Input array
>         partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
>         sparse