Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Pierre Jolivet

> On 11 Oct 2023, at 9:13 AM, Thanasis Boutsikakis 
>  wrote:
> 
> Very good catch Pierre, thanks a lot!
> 
> This made everything work: the two-step process and the ptap(). I mistakenly 
> thought that I should not let the local number of columns be None, since 
> the matrix is only partitioned row-wise. Could you please explain what went 
> wrong when I set the local column number explicitly, so that I understand 
> the philosophy behind this partitioning?

Hopefully this should make things clearer to you: 
https://petsc.org/release/manual/mat/#sec-matlayout

Thanks,
Pierre
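
For reference, a minimal sketch of the ((local, global), (local, global)) size
convention described on that manual page, written against plain petsc4py rather
than the Firedrake wrapper used in the scripts below. Passing None for a local
size is the petsc4py spelling of PETSC_DECIDE: PETSc then chooses a layout that
is compatible across objects on the same communicator.

from petsc4py import PETSc

comm = PETSc.COMM_WORLD
m, k = 100, 7

# Let PETSc decide both the local row and the local column sizes.
Phi = PETSc.Mat().createAIJ(size=((None, m), (None, k)), comm=comm)
Phi.setUp()

# Each rank owns a contiguous block of rows; the column "ownership" only fixes
# how the diagonal block is split, every owned row is still stored whole.
print(comm.getRank(), Phi.getSizes(), Phi.getOwnershipRange(),
      Phi.getOwnershipRangeColumn())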

> Thanks again,
> Thanos
> 
>> On 11 Oct 2023, at 09:04, Pierre Jolivet  wrote:
>> 
>> That’s because:
>> size = ((None, global_rows), (global_cols, global_cols)) 
>> should be:
>> size = ((None, global_rows), (None, global_cols)) 
>> Then, it will work.
>> $ ~/repo/petsc/arch-darwin-c-debug-real/bin/mpirun -n 4 python3.12 test.py 
>> && echo $?
>> 0
>> 
>> Thanks,
>> Pierre
>> 
>>> On 11 Oct 2023, at 8:58 AM, Thanasis Boutsikakis 
>>>  wrote:
>>> 
>>> Pierre, I see your point, but my experiment shows that it does not even run 
>>> due to size mismatch, so I don’t see how being sparse would change things 
>>> here. There must be some kind of problem with the parallel ptap(), because 
>>> it does run sequentially. In order to test that, I changed the flags of the 
>>> matrix creation to sparse=True and ran it again. Here is the code
>>> 
>>> """Experimenting with PETSc mat-mat multiplication"""
>>> 
>>> import numpy as np
>>> from firedrake import COMM_WORLD
>>> from firedrake.petsc import PETSc
>>> 
>>> from utilities import Print
>>> 
>>> nproc = COMM_WORLD.size
>>> rank = COMM_WORLD.rank
>>> 
>>> 
>>> def create_petsc_matrix(input_array, sparse=True):
>>> """Create a PETSc matrix from an input_array
>>> 
>>> Args:
>>> input_array (np array): Input array
>>> partition_like (PETSc mat, optional): Petsc matrix. Defaults to 
>>> None.
>>> sparse (bool, optional): Toggle for sparse or dense. Defaults to 
>>> True.
>>> 
>>> Returns:
>>> PETSc mat: PETSc mpi matrix
>>> """
>>> # Check if input_array is 1D and reshape if necessary
>>> assert len(input_array.shape) == 2, "Input array should be 
>>> 2-dimensional"
>>> global_rows, global_cols = input_array.shape
>>> size = ((None, global_rows), (global_cols, global_cols))
>>> 
>>> # Create a sparse or dense matrix based on the 'sparse' argument
>>> if sparse:
>>> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
>>> else:
>>> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
>>> matrix.setUp()
>>> 
>>> local_rows_start, local_rows_end = matrix.getOwnershipRange()
>>> 
>>> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
>>> # Calculate the correct row in the array for the current process
>>> row_in_array = counter + local_rows_start
>>> matrix.setValues(
>>> i, range(global_cols), input_array[row_in_array, :], addv=False
>>> )
>>> 
>>> # Assemble the matrix to compute the final structure
>>> matrix.assemblyBegin()
>>> matrix.assemblyEnd()
>>> 
>>> return matrix
>>> 
>>> 
>>> # 
>>> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc 
>>> matrix Phi
>>> #  A' = Phi.T * A * Phi
>>> # [k x k] <- [k x m] x [m x m] x [m x k]
>>> # 
>>> 
>>> m, k = 100, 7
>>> # Generate the random numpy matrices
>>> np.random.seed(0)  # sets the seed to 0
>>> A_np = np.random.randint(low=0, high=6, size=(m, m))
>>> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
>>> 
>>> # 
>>> # TEST: Galerkin projection of numpy matrices A_np and Phi_np
>>> # 
>>> Aprime_np = Phi_np.T @ A_np @ Phi_np
>>> 
>>> # Create A as an mpi matrix distributed on each process
>>> A = create_petsc_matrix(A_np, sparse=True)
>>> 
>>> # Create Phi as an mpi matrix distributed on each process
>>> Phi = create_petsc_matrix(Phi_np, sparse=True)
>>> 
>>> # Create an empty PETSc matrix object to store the result of the PtAP 
>>> operation.
>>> # This will hold the result A' = Phi.T * A * Phi after the computation.
>>> A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=True)
>>> 
>>> # Perform the PtAP (Phi Transpose times A times Phi) operation.
>>> # In mathematical terms, this operation is A' = Phi.T * A * Phi.
>>> # A_prime will store the result of the operation.
>>> A_prime = A.ptap(Phi)
>>> 
>>> I got
>>> 
>>> Traceback (most recent call last):
>>>   File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", 
>>> line 89, in 
>>> Traceback (most recent call last):
>>>   File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", 
>>> line 89, in 
>>> Traceback (most recent call last):
>>>   File 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Thanasis Boutsikakis
Very good catch Pierre, thanks a lot!

This made everything work: the two-step process and the ptap(). I mistakenly 
thought that I should not let the local number of columns be None, since the 
matrix is only partitioned row-wise. Could you please explain what went wrong 
when I set the local column number explicitly, so that I understand the 
philosophy behind this partitioning?

Thanks again,
Thanos

> On 11 Oct 2023, at 09:04, Pierre Jolivet  wrote:
> 
> That’s because:
> size = ((None, global_rows), (global_cols, global_cols)) 
> should be:
> size = ((None, global_rows), (None, global_cols)) 
> Then, it will work.
> $ ~/repo/petsc/arch-darwin-c-debug-real/bin/mpirun -n 4 python3.12 test.py && 
> echo $?
> 0
> 
> Thanks,
> Pierre
> 
>> On 11 Oct 2023, at 8:58 AM, Thanasis Boutsikakis 
>>  wrote:
>> 
>> Pierre, I see your point, but my experiment shows that it does not even run 
>> due to size mismatch, so I don’t see how being sparse would change things 
>> here. There must be some kind of problem with the parallel ptap(), because 
>> it does run sequentially. In order to test that, I changed the flags of the 
>> matrix creation to sparse=True and ran it again. Here is the code
>> 
>> """Experimenting with PETSc mat-mat multiplication"""
>> 
>> import numpy as np
>> from firedrake import COMM_WORLD
>> from firedrake.petsc import PETSc
>> 
>> from utilities import Print
>> 
>> nproc = COMM_WORLD.size
>> rank = COMM_WORLD.rank
>> 
>> 
>> def create_petsc_matrix(input_array, sparse=True):
>> """Create a PETSc matrix from an input_array
>> 
>> Args:
>> input_array (np array): Input array
>> partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
>> sparse (bool, optional): Toggle for sparse or dense. Defaults to 
>> True.
>> 
>> Returns:
>> PETSc mat: PETSc mpi matrix
>> """
>> # Check if input_array is 1D and reshape if necessary
>> assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
>> global_rows, global_cols = input_array.shape
>> size = ((None, global_rows), (global_cols, global_cols))
>> 
>> # Create a sparse or dense matrix based on the 'sparse' argument
>> if sparse:
>> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
>> else:
>> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
>> matrix.setUp()
>> 
>> local_rows_start, local_rows_end = matrix.getOwnershipRange()
>> 
>> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
>> # Calculate the correct row in the array for the current process
>> row_in_array = counter + local_rows_start
>> matrix.setValues(
>> i, range(global_cols), input_array[row_in_array, :], addv=False
>> )
>> 
>> # Assemble the matrix to compute the final structure
>> matrix.assemblyBegin()
>> matrix.assemblyEnd()
>> 
>> return matrix
>> 
>> 
>> # 
>> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix 
>> Phi
>> #  A' = Phi.T * A * Phi
>> # [k x k] <- [k x m] x [m x m] x [m x k]
>> # 
>> 
>> m, k = 100, 7
>> # Generate the random numpy matrices
>> np.random.seed(0)  # sets the seed to 0
>> A_np = np.random.randint(low=0, high=6, size=(m, m))
>> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
>> 
>> # 
>> # TEST: Galerkin projection of numpy matrices A_np and Phi_np
>> # 
>> Aprime_np = Phi_np.T @ A_np @ Phi_np
>> 
>> # Create A as an mpi matrix distributed on each process
>> A = create_petsc_matrix(A_np, sparse=True)
>> 
>> # Create Phi as an mpi matrix distributed on each process
>> Phi = create_petsc_matrix(Phi_np, sparse=True)
>> 
>> # Create an empty PETSc matrix object to store the result of the PtAP 
>> operation.
>> # This will hold the result A' = Phi.T * A * Phi after the computation.
>> A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=True)
>> 
>> # Perform the PtAP (Phi Transpose times A times Phi) operation.
>> # In mathematical terms, this operation is A' = Phi.T * A * Phi.
>> # A_prime will store the result of the operation.
>> A_prime = A.ptap(Phi)
>> 
>> I got
>> 
>> Traceback (most recent call last):
>>   File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", 
>> line 89, in 
>> Traceback (most recent call last):
>>   File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", 
>> line 89, in 
>> Traceback (most recent call last):
>>   File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", 
>> line 89, in 
>> A_prime = A.ptap(Phi)
>> A_prime = A.ptap(Phi)
>>   ^^^
>>   File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
>> A_prime = A.ptap(Phi)
>>   ^^^
>>   ^^^
>>   File 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Pierre Jolivet
That’s because:
size = ((None, global_rows), (global_cols, global_cols)) 
should be:
size = ((None, global_rows), (None, global_cols)) 
Then, it will work.
$ ~/repo/petsc/arch-darwin-c-debug-real/bin/mpirun -n 4 python3.12 test.py && 
echo $?
0

Thanks,
Pierre

> On 11 Oct 2023, at 8:58 AM, Thanasis Boutsikakis 
>  wrote:
> 
> Pierre, I see your point, but my experiment shows that it does not even run 
> due to size mismatch, so I don’t see how being sparse would change things 
> here. There must be some kind of problem with the parallel ptap(), because it 
> does run sequentially. In order to test that, I changed the flags of the 
> matrix creation to sparse=True and ran it again. Here is the code
> 
> """Experimenting with PETSc mat-mat multiplication"""
> 
> import numpy as np
> from firedrake import COMM_WORLD
> from firedrake.petsc import PETSc
> 
> from utilities import Print
> 
> nproc = COMM_WORLD.size
> rank = COMM_WORLD.rank
> 
> 
> def create_petsc_matrix(input_array, sparse=True):
> """Create a PETSc matrix from an input_array
> 
> Args:
> input_array (np array): Input array
> partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
> sparse (bool, optional): Toggle for sparse or dense. Defaults to 
> True.
> 
> Returns:
> PETSc mat: PETSc mpi matrix
> """
> # Check if input_array is 1D and reshape if necessary
> assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
> global_rows, global_cols = input_array.shape
> size = ((None, global_rows), (global_cols, global_cols))
> 
> # Create a sparse or dense matrix based on the 'sparse' argument
> if sparse:
> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
> else:
> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
> matrix.setUp()
> 
> local_rows_start, local_rows_end = matrix.getOwnershipRange()
> 
> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
> # Calculate the correct row in the array for the current process
> row_in_array = counter + local_rows_start
> matrix.setValues(
> i, range(global_cols), input_array[row_in_array, :], addv=False
> )
> 
> # Assemble the matrix to compute the final structure
> matrix.assemblyBegin()
> matrix.assemblyEnd()
> 
> return matrix
> 
> 
> # 
> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix 
> Phi
> #  A' = Phi.T * A * Phi
> # [k x k] <- [k x m] x [m x m] x [m x k]
> # 
> 
> m, k = 100, 7
> # Generate the random numpy matrices
> np.random.seed(0)  # sets the seed to 0
> A_np = np.random.randint(low=0, high=6, size=(m, m))
> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
> 
> # 
> # TEST: Galerkin projection of numpy matrices A_np and Phi_np
> # 
> Aprime_np = Phi_np.T @ A_np @ Phi_np
> 
> # Create A as an mpi matrix distributed on each process
> A = create_petsc_matrix(A_np, sparse=True)
> 
> # Create Phi as an mpi matrix distributed on each process
> Phi = create_petsc_matrix(Phi_np, sparse=True)
> 
> # Create an empty PETSc matrix object to store the result of the PtAP 
> operation.
> # This will hold the result A' = Phi.T * A * Phi after the computation.
> A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=True)
> 
> # Perform the PtAP (Phi Transpose times A times Phi) operation.
> # In mathematical terms, this operation is A' = Phi.T * A * Phi.
> # A_prime will store the result of the operation.
> A_prime = A.ptap(Phi)
> 
> I got
> 
> Traceback (most recent call last):
>   File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
> 89, in 
> Traceback (most recent call last):
>   File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
> 89, in 
> Traceback (most recent call last):
>   File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
> 89, in 
> A_prime = A.ptap(Phi)
> A_prime = A.ptap(Phi)
>   ^^^
>   File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
> A_prime = A.ptap(Phi)
>   ^^^
>   ^^^
>   File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
>   File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
> petsc4py.PETSc.Error: error code 60
> [0] MatPtAP() at 
> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9896
> [0] MatProductSetFromOptions() at 
> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541
> [0] MatProductSetFromOptions_Private() at 
> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:435
> [0] MatProductSetFromOptions_MPIAIJ() at 
> 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Thanasis Boutsikakis
Furthermore, I tried to perform the Galerkin projection in two steps by 
substituting

> A_prime = A.ptap(Phi)

With 

AL = Phi.transposeMatMult(A)
A_prime = AL.matMult(Phi)

Running this with 3 procs results in the incorrect creation of a matrix AL 
whose dimensions are 3 times larger than they should be (A is of size 100x100 
and Phi is of size 100x7):

MATRIX mpiaij AL [21x300]
Assembled

Partitioning for AL:
  Rank 0: Rows [0, 7)
  Rank 1: Rows [7, 14)
  Rank 2: Rows [14, 21)

And naturally, this leads to another dimension incompatibility:

Traceback (most recent call last):
  File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
85, in 
Traceback (most recent call last):
  File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
85, in 
A_prime = AL.matMult(Phi)
A_prime = AL.matMult(Phi)
  ^^^
  File "petsc4py/PETSc/Mat.pyx", line 1492, in petsc4py.PETSc.Mat.matMult
  ^^^
  File "petsc4py/PETSc/Mat.pyx", line 1492, in petsc4py.PETSc.Mat.matMult
Traceback (most recent call last):
  File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
85, in 
petsc4py.PETSc.Error: error code 60
[2] MatMatMult() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:10053
[2] MatProduct_Private() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9976
[2] MatProductSetFromOptions() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541
[2] MatProductSetFromOptions_Private() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:421
[2] Nonconforming object sizes
[2] Matrix dimensions of A and B are incompatible for MatProductType AB: A 
21x300, B 100x7
Abort(1) on node 2 (rank 2 in comm 496): application called 
MPI_Abort(PYOP2_COMM_WORLD, 1) - process 2
petsc4py.PETSc.Error: error code 60
[1] MatMatMult() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:10053
[1] MatProduct_Private() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9976
[1] MatProductSetFromOptions() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541
[1] MatProductSetFromOptions_Private() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:421
[1] Nonconforming object sizes
[1] Matrix dimensions of A and B are incompatible for MatProductType AB: A 
21x300, B 100x7
Abort(1) on node 1 (rank 1 in comm 496): application called 
MPI_Abort(PYOP2_COMM_WORLD, 1) - process 1
A_prime = AL.matMult(Phi)
  ^^^
  File "petsc4py/PETSc/Mat.pyx", line 1492, in petsc4py.PETSc.Mat.matMult
petsc4py.PETSc.Error: error code 60
[0] MatMatMult() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:10053
[0] MatProduct_Private() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9976
[0] MatProductSetFromOptions() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541
[0] MatProductSetFromOptions_Private() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:421
[0] Nonconforming object sizes
[0] Matrix dimensions of A and B are incompatible for MatProductType AB: A 
21x300, B 100x7
Abort(1) on node 0 (rank 0 in comm 496): application called 
MPI_Abort(PYOP2_COMM_WORLD, 1) - process 0
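
For comparison, a minimal self-contained sketch of the same two-step projection
with the local column sizes left to PETSc (the ((None, rows), (None, cols)) fix
given elsewhere in the thread), using plain petsc4py instead of the Firedrake
wrapper so it can be run standalone with, e.g., mpirun -n 3:

import numpy as np
from petsc4py import PETSc

comm = PETSc.COMM_WORLD
m, k = 100, 7
np.random.seed(0)
A_np = np.random.randint(low=0, high=6, size=(m, m)).astype(PETSc.ScalarType)
Phi_np = np.random.randint(low=0, high=6, size=(m, k)).astype(PETSc.ScalarType)

def create_matrix(array):
    rows, cols = array.shape
    # Leave both local sizes to PETSc so the layouts of A and Phi stay compatible.
    mat = PETSc.Mat().createAIJ(size=((None, rows), (None, cols)), comm=comm)
    mat.setUp()
    rstart, rend = mat.getOwnershipRange()
    for i in range(rstart, rend):
        mat.setValues(i, range(cols), array[i, :])
    mat.assemblyBegin()
    mat.assemblyEnd()
    return mat

A = create_matrix(A_np)
Phi = create_matrix(Phi_np)

AL = Phi.transposeMatMult(A)   # [k x m], with a consistent layout for AL
A_prime = AL.matMult(Phi)      # [k x k], same result as A.ptap(Phi)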

> On 10 Oct 2023, at 23:33, Thanasis Boutsikakis 
>  wrote:
> 
> Hi all,
> 
> Revisiting my code and the proposed solution from Pierre, I realized this 
> works only sequentially. The reason is that PETSc partitions those matrices 
> only row-wise, which leads to an error due to the mismatch between the number 
> of columns of A (non-partitioned) and the number of rows of Phi (partitioned).
> 
> """Experimenting with PETSc mat-mat multiplication"""
> 
> import time
> 
> import numpy as np
> from colorama import Fore
> from firedrake import COMM_SELF, COMM_WORLD
> from firedrake.petsc import PETSc
> from mpi4py import MPI
> from numpy.testing import assert_array_almost_equal
> 
> from utilities import Print
> 
> nproc = COMM_WORLD.size
> rank = COMM_WORLD.rank
> 
> def create_petsc_matrix(input_array, sparse=True):
> """Create a PETSc matrix from an input_array
> 
> Args:
> input_array (np array): Input array
> partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
> sparse (bool, optional): Toggle for sparse or dense. Defaults to 
> True.
> 
> Returns:
> PETSc mat: PETSc mpi matrix
> """
> # Check if input_array is 1D and reshape if necessary
> assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
> global_rows, global_cols = input_array.shape
> size = ((None, global_rows), (global_cols, global_cols))
> 
> # Create a sparse or dense matrix based on the 'sparse' argument
> if sparse:
> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
> else:
> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-11 Thread Thanasis Boutsikakis
Pierre, I see your point, but my experiment shows that it does not even run due 
to size mismatch, so I don’t see how being sparse would change things here. 
There must be some kind of problem with the parallel ptap(), because it does 
run sequentially. In order to test that, I changed the flags of the matrix 
creation to sparse=True and ran it again. Here is the code

"""Experimenting with PETSc mat-mat multiplication"""

import numpy as np
from firedrake import COMM_WORLD
from firedrake.petsc import PETSc

from utilities import Print

nproc = COMM_WORLD.size
rank = COMM_WORLD.rank


def create_petsc_matrix(input_array, sparse=True):
    """Create a PETSc matrix from an input_array

    Args:
        input_array (np array): Input array
        partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
        sparse (bool, optional): Toggle for sparse or dense. Defaults to True.

    Returns:
        PETSc mat: PETSc mpi matrix
    """
    # Check if input_array is 1D and reshape if necessary
    assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
    global_rows, global_cols = input_array.shape
    size = ((None, global_rows), (global_cols, global_cols))

    # Create a sparse or dense matrix based on the 'sparse' argument
    if sparse:
        matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
    else:
        matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
    matrix.setUp()

    local_rows_start, local_rows_end = matrix.getOwnershipRange()

    for counter, i in enumerate(range(local_rows_start, local_rows_end)):
        # Calculate the correct row in the array for the current process
        row_in_array = counter + local_rows_start
        matrix.setValues(
            i, range(global_cols), input_array[row_in_array, :], addv=False
        )

    # Assemble the matrix to compute the final structure
    matrix.assemblyBegin()
    matrix.assemblyEnd()

    return matrix


# 
# EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix Phi
#  A' = Phi.T * A * Phi
# [k x k] <- [k x m] x [m x m] x [m x k]
# 

m, k = 100, 7
# Generate the random numpy matrices
np.random.seed(0)  # sets the seed to 0
A_np = np.random.randint(low=0, high=6, size=(m, m))
Phi_np = np.random.randint(low=0, high=6, size=(m, k))

# 
# TEST: Galerkin projection of numpy matrices A_np and Phi_np
# 
Aprime_np = Phi_np.T @ A_np @ Phi_np

# Create A as an mpi matrix distributed on each process
A = create_petsc_matrix(A_np, sparse=True)

# Create Phi as an mpi matrix distributed on each process
Phi = create_petsc_matrix(Phi_np, sparse=True)

# Create an empty PETSc matrix object to store the result of the PtAP operation.
# This will hold the result A' = Phi.T * A * Phi after the computation.
A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=True)

# Perform the PtAP (Phi Transpose times A times Phi) operation.
# In mathematical terms, this operation is A' = Phi.T * A * Phi.
# A_prime will store the result of the operation.
A_prime = A.ptap(Phi)

I got

Traceback (most recent call last):
  File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
89, in 
Traceback (most recent call last):
  File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
89, in 
Traceback (most recent call last):
  File "/Users/boutsitron/petsc-experiments/mat_vec_multiplication2.py", line 
89, in 
A_prime = A.ptap(Phi)
A_prime = A.ptap(Phi)
  ^^^
  File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
A_prime = A.ptap(Phi)
  ^^^
  ^^^
  File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
  File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
petsc4py.PETSc.Error: error code 60
[0] MatPtAP() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9896
[0] MatProductSetFromOptions() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541
[0] MatProductSetFromOptions_Private() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:435
[0] MatProductSetFromOptions_MPIAIJ() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c:2372
[0] MatProductSetFromOptions_MPIAIJ_PtAP() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c:2266
[0] Nonconforming object sizes
[0] Matrix local dimensions are incompatible, Acol (0, 100) != Prow (0,34)
Abort(1) on node 0 (rank 0 in comm 496): application called 
MPI_Abort(PYOP2_COMM_WORLD, 1) - process 0
petsc4py.PETSc.Error: error code 60
[1] MatPtAP() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9896
[1] MatProductSetFromOptions() at 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-10 Thread Pierre Jolivet
I disagree with what Mark and Matt are saying: your code is fine, the error 
message is fine, petsc4py is fine (in this instance).
It’s not a typical use case of MatPtAP(), which is mostly designed for MatAIJ, 
not MatDense.
On the one hand, in the MatDense case, indeed there will be a mismatch between 
the number of columns of A and the number of rows of P, as written in the error 
message.
On the other hand, there is not much to optimize when computing C = P’ A P with 
everything being dense.
I would just write this as B = A P and then C = P’ B (but then you may face the 
same issue as initially reported, please let us know then).

Thanks,
Pierre
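
In petsc4py terms, a sketch of that two-step dense product, assuming A and Phi
are the assembled matrices from the quoted script and letting each call create
its own result matrix:

B = A.matMult(Phi)            # B = A * Phi,    [m x k]
C = Phi.transposeMatMult(B)   # C = Phi^T * B,  [k x k], i.e. A_prime in the quoted script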

> On 11 Oct 2023, at 2:42 AM, Mark Adams  wrote:
> 
> This looks like a false positive or there is some subtle bug here that we are 
> not seeing.
> Could this be the first time parallel PtAP has been used (and reported) in 
> petsc4py?
> 
> Mark
> 
> On Tue, Oct 10, 2023 at 8:27 PM Matthew Knepley  > wrote:
>> On Tue, Oct 10, 2023 at 5:34 PM Thanasis Boutsikakis 
>> > > wrote:
>>> Hi all,
>>> 
>>> Revisiting my code and the proposed solution from Pierre, I realized this 
>>> works only sequentially. The reason is that PETSc partitions those 
>>> matrices only row-wise, which leads to an error due to the mismatch between 
>>> the number of columns of A (non-partitioned) and the number of rows of Phi 
>>> (partitioned).
>> 
>> Are you positive about this? P^T A P is designed to run in this scenario, so 
>> either we have a bug or the diagnosis is wrong.
>> 
>>   Thanks,
>> 
>>  Matt
>>  
>>> """Experimenting with PETSc mat-mat multiplication"""
>>> 
>>> import time
>>> 
>>> import numpy as np
>>> from colorama import Fore
>>> from firedrake import COMM_SELF, COMM_WORLD
>>> from firedrake.petsc import PETSc
>>> from mpi4py import MPI
>>> from numpy.testing import assert_array_almost_equal
>>> 
>>> from utilities import Print
>>> 
>>> nproc = COMM_WORLD.size
>>> rank = COMM_WORLD.rank
>>> 
>>> def create_petsc_matrix(input_array, sparse=True):
>>> """Create a PETSc matrix from an input_array
>>> 
>>> Args:
>>> input_array (np array): Input array
>>> partition_like (PETSc mat, optional): Petsc matrix. Defaults to 
>>> None.
>>> sparse (bool, optional): Toggle for sparse or dense. Defaults to 
>>> True.
>>> 
>>> Returns:
>>> PETSc mat: PETSc mpi matrix
>>> """
>>> # Check if input_array is 1D and reshape if necessary
>>> assert len(input_array.shape) == 2, "Input array should be 
>>> 2-dimensional"
>>> global_rows, global_cols = input_array.shape
>>> size = ((None, global_rows), (global_cols, global_cols))
>>> 
>>> # Create a sparse or dense matrix based on the 'sparse' argument
>>> if sparse:
>>> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
>>> else:
>>> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
>>> matrix.setUp()
>>> 
>>> local_rows_start, local_rows_end = matrix.getOwnershipRange()
>>> 
>>> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
>>> # Calculate the correct row in the array for the current process
>>> row_in_array = counter + local_rows_start
>>> matrix.setValues(
>>> i, range(global_cols), input_array[row_in_array, :], addv=False
>>> )
>>> 
>>> # Assemble the matrix to compute the final structure
>>> matrix.assemblyBegin()
>>> matrix.assemblyEnd()
>>> 
>>> return matrix
>>> 
>>> # 
>>> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc 
>>> matrix Phi
>>> #  A' = Phi.T * A * Phi
>>> # [k x k] <- [k x m] x [m x m] x [m x k]
>>> # 
>>> 
>>> m, k = 100, 7
>>> # Generate the random numpy matrices
>>> np.random.seed(0)  # sets the seed to 0
>>> A_np = np.random.randint(low=0, high=6, size=(m, m))
>>> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
>>> 
>>> # 
>>> # TEST: Galerkin projection of numpy matrices A_np and Phi_np
>>> # 
>>> Aprime_np = Phi_np.T @ A_np @ Phi_np
>>> Print(f"MATRIX Aprime_np [{Aprime_np.shape[0]}x{Aprime_np.shape[1]}]")
>>> Print(f"{Aprime_np}")
>>> 
>>> # Create A as an mpi matrix distributed on each process
>>> A = create_petsc_matrix(A_np, sparse=False)
>>> 
>>> # Create Phi as an mpi matrix distributed on each process
>>> Phi = create_petsc_matrix(Phi_np, sparse=False)
>>> 
>>> # Create an empty PETSc matrix object to store the result of the PtAP 
>>> operation.
>>> # This will hold the result A' = Phi.T * A * Phi after the computation.
>>> A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=False)
>>> 
>>> # Perform the PtAP (Phi Transpose times A times Phi) operation.
>>> # In mathematical terms, this operation is A' = Phi.T * A * Phi.
>>> # 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-10 Thread Mark Adams
This looks like a false positive or there is some subtle bug here that we
are not seeing.
Could this be the first time parallel PtAP has been used (and reported) in
petsc4py?

Mark

On Tue, Oct 10, 2023 at 8:27 PM Matthew Knepley  wrote:

> On Tue, Oct 10, 2023 at 5:34 PM Thanasis Boutsikakis <
> thanasis.boutsika...@corintis.com> wrote:
>
>> Hi all,
>>
>> Revisiting my code and the proposed solution from Pierre, I realized this
>> works only sequentially. The reason is that PETSc partitions those
>> matrices only row-wise, which leads to an error due to the mismatch between
>> the number of columns of A (non-partitioned) and the number of rows of Phi
>> (partitioned).
>>
>
> Are you positive about this? P^T A P is designed to run in this scenario,
> so either we have a bug or the diagnosis is wrong.
>
>   Thanks,
>
>  Matt
>
>
>> """Experimenting with PETSc mat-mat multiplication"""
>>
>> import time
>>
>> import numpy as np
>> from colorama import Fore
>> from firedrake import COMM_SELF, COMM_WORLD
>> from firedrake.petsc import PETSc
>> from mpi4py import MPI
>> from numpy.testing import assert_array_almost_equal
>>
>> from utilities import Print
>>
>> nproc = COMM_WORLD.size
>> rank = COMM_WORLD.rank
>>
>> def create_petsc_matrix(input_array, sparse=True):
>> """Create a PETSc matrix from an input_array
>>
>> Args:
>> input_array (np array): Input array
>> partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
>> sparse (bool, optional): Toggle for sparse or dense. Defaults to True.
>>
>> Returns:
>> PETSc mat: PETSc mpi matrix
>> """
>> # Check if input_array is 1D and reshape if necessary
>> assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
>> global_rows, global_cols = input_array.shape
>> size = ((None, global_rows), (global_cols, global_cols))
>>
>> # Create a sparse or dense matrix based on the 'sparse' argument
>> if sparse:
>> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
>> else:
>> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
>> matrix.setUp()
>>
>> local_rows_start, local_rows_end = matrix.getOwnershipRange()
>>
>> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
>> # Calculate the correct row in the array for the current process
>> row_in_array = counter + local_rows_start
>> matrix.setValues(
>> i, range(global_cols), input_array[row_in_array, :], addv=False
>> )
>>
>> # Assemble the matrix to compute the final structure
>> matrix.assemblyBegin()
>> matrix.assemblyEnd()
>>
>> return matrix
>>
>> # 
>> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc
>> matrix Phi
>> # A' = Phi.T * A * Phi
>> # [k x k] <- [k x m] x [m x m] x [m x k]
>> # 
>>
>> m, k = 100, 7
>> # Generate the random numpy matrices
>> np.random.seed(0) # sets the seed to 0
>> A_np = np.random.randint(low=0, high=6, size=(m, m))
>> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
>>
>> # 
>> # TEST: Galerkin projection of numpy matrices A_np and Phi_np
>> # 
>> Aprime_np = Phi_np.T @ A_np @ Phi_np
>> Print(f"MATRIX Aprime_np [{Aprime_np.shape[0]}x{Aprime_np.shape[1]}]")
>> Print(f"{Aprime_np}")
>>
>> # Create A as an mpi matrix distributed on each process
>> A = create_petsc_matrix(A_np, sparse=False)
>>
>> # Create Phi as an mpi matrix distributed on each process
>> Phi = create_petsc_matrix(Phi_np, sparse=False)
>>
>> # Create an empty PETSc matrix object to store the result of the PtAP
>> operation.
>> # This will hold the result A' = Phi.T * A * Phi after the computation.
>> A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=False)
>>
>> # Perform the PtAP (Phi Transpose times A times Phi) operation.
>> # In mathematical terms, this operation is A' = Phi.T * A * Phi.
>> # A_prime will store the result of the operation.
>> A_prime = A.ptap(Phi)
>>
>> Here is the error
>>
>> MATRIX mpiaij A [100x100]
>> Assembled
>>
>> Partitioning for A:
>>   Rank 0: Rows [0, 34)
>>   Rank 1: Rows [34, 67)
>>   Rank 2: Rows [67, 100)
>>
>> MATRIX mpiaij Phi [100x7]
>> Assembled
>>
>> Partitioning for Phi:
>>   Rank 0: Rows [0, 34)
>>   Rank 1: Rows [34, 67)
>>   Rank 2: Rows [67, 100)
>>
>> Traceback (most recent call last):
>>   File "/Users/boutsitron/work/galerkin_projection.py", line 87, in
>> 
>> A_prime = A.ptap(Phi)
>>   ^^^
>>   File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
>> petsc4py.PETSc.Error: error code 60
>> [0] MatPtAP() at
>> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9896
>> [0] MatProductSetFromOptions() at
>> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541
>> [0] MatProductSetFromOptions_Private() at
>> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:435
>> [0] MatProductSetFromOptions_MPIAIJ() at
>> 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-10 Thread Matthew Knepley
On Tue, Oct 10, 2023 at 5:34 PM Thanasis Boutsikakis <
thanasis.boutsika...@corintis.com> wrote:

> Hi all,
>
> Revisiting my code and the proposed solution from Pierre, I realized this
> works only sequentially. The reason is that PETSc partitions those
> matrices only row-wise, which leads to an error due to the mismatch between
> the number of columns of A (non-partitioned) and the number of rows of Phi
> (partitioned).
>

Are you positive about this? P^T A P is designed to run in this scenario,
so either we have a bug or the diagnosis is wrong.

  Thanks,

 Matt


> """Experimenting with PETSc mat-mat multiplication"""
>
> import time
>
> import numpy as np
> from colorama import Fore
> from firedrake import COMM_SELF, COMM_WORLD
> from firedrake.petsc import PETSc
> from mpi4py import MPI
> from numpy.testing import assert_array_almost_equal
>
> from utilities import Print
>
> nproc = COMM_WORLD.size
> rank = COMM_WORLD.rank
>
> def create_petsc_matrix(input_array, sparse=True):
> """Create a PETSc matrix from an input_array
>
> Args:
> input_array (np array): Input array
> partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
> sparse (bool, optional): Toggle for sparse or dense. Defaults to True.
>
> Returns:
> PETSc mat: PETSc mpi matrix
> """
> # Check if input_array is 1D and reshape if necessary
> assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
> global_rows, global_cols = input_array.shape
> size = ((None, global_rows), (global_cols, global_cols))
>
> # Create a sparse or dense matrix based on the 'sparse' argument
> if sparse:
> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
> else:
> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
> matrix.setUp()
>
> local_rows_start, local_rows_end = matrix.getOwnershipRange()
>
> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
> # Calculate the correct row in the array for the current process
> row_in_array = counter + local_rows_start
> matrix.setValues(
> i, range(global_cols), input_array[row_in_array, :], addv=False
> )
>
> # Assemble the matrix to compute the final structure
> matrix.assemblyBegin()
> matrix.assemblyEnd()
>
> return matrix
>
> # 
> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc
> matrix Phi
> # A' = Phi.T * A * Phi
> # [k x k] <- [k x m] x [m x m] x [m x k]
> # 
>
> m, k = 100, 7
> # Generate the random numpy matrices
> np.random.seed(0) # sets the seed to 0
> A_np = np.random.randint(low=0, high=6, size=(m, m))
> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
>
> # 
> # TEST: Galerkin projection of numpy matrices A_np and Phi_np
> # 
> Aprime_np = Phi_np.T @ A_np @ Phi_np
> Print(f"MATRIX Aprime_np [{Aprime_np.shape[0]}x{Aprime_np.shape[1]}]")
> Print(f"{Aprime_np}")
>
> # Create A as an mpi matrix distributed on each process
> A = create_petsc_matrix(A_np, sparse=False)
>
> # Create Phi as an mpi matrix distributed on each process
> Phi = create_petsc_matrix(Phi_np, sparse=False)
>
> # Create an empty PETSc matrix object to store the result of the PtAP
> operation.
> # This will hold the result A' = Phi.T * A * Phi after the computation.
> A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=False)
>
> # Perform the PtAP (Phi Transpose times A times Phi) operation.
> # In mathematical terms, this operation is A' = Phi.T * A * Phi.
> # A_prime will store the result of the operation.
> A_prime = A.ptap(Phi)
>
> Here is the error
>
> MATRIX mpiaij A [100x100]
> Assembled
>
> Partitioning for A:
>   Rank 0: Rows [0, 34)
>   Rank 1: Rows [34, 67)
>   Rank 2: Rows [67, 100)
>
> MATRIX mpiaij Phi [100x7]
> Assembled
>
> Partitioning for Phi:
>   Rank 0: Rows [0, 34)
>   Rank 1: Rows [34, 67)
>   Rank 2: Rows [67, 100)
>
> Traceback (most recent call last):
>   File "/Users/boutsitron/work/galerkin_projection.py", line 87, in
> 
> A_prime = A.ptap(Phi)
>   ^^^
>   File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
> petsc4py.PETSc.Error: error code 60
> [0] MatPtAP() at
> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9896
> [0] MatProductSetFromOptions() at
> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541
> [0] MatProductSetFromOptions_Private() at
> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:435
> [0] MatProductSetFromOptions_MPIAIJ() at
> /Users/boutsitron/firedrake/src/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c:2372
> [0] MatProductSetFromOptions_MPIAIJ_PtAP() at
> /Users/boutsitron/firedrake/src/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c:2266
> [0] Nonconforming object sizes
> [0] Matrix local dimensions are incompatible, Acol (0, 100) != Prow (0,34)
> Abort(1) on node 0 (rank 0 in comm 496): application called
> 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-10 Thread Thanasis Boutsikakis
Hi all,

Revisiting my code and the proposed solution from Pierre, I realized this works 
only sequentially. The reason is that PETSc partitions those matrices only 
row-wise, which leads to an error due to the mismatch between the number of 
columns of A (non-partitioned) and the number of rows of Phi (partitioned).

"""Experimenting with PETSc mat-mat multiplication"""

import time

import numpy as np
from colorama import Fore
from firedrake import COMM_SELF, COMM_WORLD
from firedrake.petsc import PETSc
from mpi4py import MPI
from numpy.testing import assert_array_almost_equal

from utilities import Print

nproc = COMM_WORLD.size
rank = COMM_WORLD.rank

def create_petsc_matrix(input_array, sparse=True):
    """Create a PETSc matrix from an input_array

    Args:
        input_array (np array): Input array
        partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
        sparse (bool, optional): Toggle for sparse or dense. Defaults to True.

    Returns:
        PETSc mat: PETSc mpi matrix
    """
    # Check if input_array is 1D and reshape if necessary
    assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
    global_rows, global_cols = input_array.shape
    size = ((None, global_rows), (global_cols, global_cols))

    # Create a sparse or dense matrix based on the 'sparse' argument
    if sparse:
        matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
    else:
        matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
    matrix.setUp()

    local_rows_start, local_rows_end = matrix.getOwnershipRange()

    for counter, i in enumerate(range(local_rows_start, local_rows_end)):
        # Calculate the correct row in the array for the current process
        row_in_array = counter + local_rows_start
        matrix.setValues(
            i, range(global_cols), input_array[row_in_array, :], addv=False
        )

    # Assemble the matrix to compute the final structure
    matrix.assemblyBegin()
    matrix.assemblyEnd()

    return matrix

# 
# EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix Phi
#  A' = Phi.T * A * Phi
# [k x k] <- [k x m] x [m x m] x [m x k]
# 

m, k = 100, 7
# Generate the random numpy matrices
np.random.seed(0)  # sets the seed to 0
A_np = np.random.randint(low=0, high=6, size=(m, m))
Phi_np = np.random.randint(low=0, high=6, size=(m, k))

# 
# TEST: Galerkin projection of numpy matrices A_np and Phi_np
# 
Aprime_np = Phi_np.T @ A_np @ Phi_np
Print(f"MATRIX Aprime_np [{Aprime_np.shape[0]}x{Aprime_np.shape[1]}]")
Print(f"{Aprime_np}")

# Create A as an mpi matrix distributed on each process
A = create_petsc_matrix(A_np, sparse=False)

# Create Phi as an mpi matrix distributed on each process
Phi = create_petsc_matrix(Phi_np, sparse=False)

# Create an empty PETSc matrix object to store the result of the PtAP operation.
# This will hold the result A' = Phi.T * A * Phi after the computation.
A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=False)

# Perform the PtAP (Phi Transpose times A times Phi) operation.
# In mathematical terms, this operation is A' = Phi.T * A * Phi.
# A_prime will store the result of the operation.
A_prime = A.ptap(Phi)

Here is the error

MATRIX mpiaij A [100x100]
Assembled

Partitioning for A:
  Rank 0: Rows [0, 34)
  Rank 1: Rows [34, 67)
  Rank 2: Rows [67, 100)

MATRIX mpiaij Phi [100x7]
Assembled

Partitioning for Phi:
  Rank 0: Rows [0, 34)
  Rank 1: Rows [34, 67)
  Rank 2: Rows [67, 100)

Traceback (most recent call last):
  File "/Users/boutsitron/work/galerkin_projection.py", line 87, in 
A_prime = A.ptap(Phi)
  ^^^
  File "petsc4py/PETSc/Mat.pyx", line 1525, in petsc4py.PETSc.Mat.ptap
petsc4py.PETSc.Error: error code 60
[0] MatPtAP() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9896
[0] MatProductSetFromOptions() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:541
[0] MatProductSetFromOptions_Private() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/interface/matproduct.c:435
[0] MatProductSetFromOptions_MPIAIJ() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c:2372
[0] MatProductSetFromOptions_MPIAIJ_PtAP() at 
/Users/boutsitron/firedrake/src/petsc/src/mat/impls/aij/mpi/mpimatmatmult.c:2266
[0] Nonconforming object sizes
[0] Matrix local dimensions are incompatible, Acol (0, 100) != Prow (0,34)
Abort(1) on node 0 (rank 0 in comm 496): application called 
MPI_Abort(PYOP2_COMM_WORLD, 1) - process 0

Any thoughts?

Thanks,
Thanos

> On 5 Oct 2023, at 14:23, Thanasis Boutsikakis 
>  wrote:
> 
> This works Pierre. Amazing input, thanks a lot!
> 
>> On 5 Oct 2023, at 14:17, Pierre Jolivet  wrote:
>> 
>> Not a petsc4py expert here, but you may want to try instead:
>> A_prime = A.ptap(Phi)
>> 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-05 Thread Thanasis Boutsikakis
This works, Pierre. Amazing input, thanks a lot!

> On 5 Oct 2023, at 14:17, Pierre Jolivet  wrote:
> 
> Not a petsc4py expert here, but you may want to try instead:
> A_prime = A.ptap(Phi)
> 
> Thanks,
> Pierre
> 
>> On 5 Oct 2023, at 2:02 PM, Thanasis Boutsikakis 
>>  wrote:
>> 
>> Thanks Pierre! So I tried this and got a segmentation fault. Is this 
>> supposed to work right off the bat or am I missing something?
>> 
>> [0]PETSC ERROR: 
>> 
>> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
>> probably memory access out of range
>> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>> [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and 
>> https://petsc.org/release/faq/
>> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and 
>> run
>> [0]PETSC ERROR: to get more information on the crash.
>> [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is 
>> causing the crash.
>> Abort(59) on node 0 (rank 0 in comm 0): application called 
>> MPI_Abort(MPI_COMM_WORLD, 59) - process 0
>> 
>> """Experimenting with PETSc mat-mat multiplication"""
>> 
>> import time
>> 
>> import numpy as np
>> from colorama import Fore
>> from firedrake import COMM_SELF, COMM_WORLD
>> from firedrake.petsc import PETSc
>> from mpi4py import MPI
>> from numpy.testing import assert_array_almost_equal
>> 
>> from utilities import (
>> Print,
>> create_petsc_matrix,
>> print_matrix_partitioning,
>> )
>> 
>> nproc = COMM_WORLD.size
>> rank = COMM_WORLD.rank
>> 
>> # 
>> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix 
>> Phi
>> #  A' = Phi.T * A * Phi
>> # [k x k] <- [k x m] x [m x m] x [m x k]
>> # 
>> 
>> m, k = 11, 7
>> # Generate the random numpy matrices
>> np.random.seed(0)  # sets the seed to 0
>> A_np = np.random.randint(low=0, high=6, size=(m, m))
>> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
>> 
>> # 
>> # TEST: Galerkin projection of numpy matrices A_np and Phi_np
>> # 
>> Aprime_np = Phi_np.T @ A_np @ Phi_np
>> Print(f"MATRIX Aprime_np [{Aprime_np.shape[0]}x{Aprime_np.shape[1]}]")
>> Print(f"{Aprime_np}")
>> 
>> # Create A as an mpi matrix distributed on each process
>> A = create_petsc_matrix(A_np, sparse=False)
>> 
>> # Create Phi as an mpi matrix distributed on each process
>> Phi = create_petsc_matrix(Phi_np, sparse=False)
>> 
>> # Create an empty PETSc matrix object to store the result of the PtAP 
>> operation.
>> # This will hold the result A' = Phi.T * A * Phi after the computation.
>> A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=False)
>> 
>> # Perform the PtAP (Phi Transpose times A times Phi) operation.
>> # In mathematical terms, this operation is A' = Phi.T * A * Phi.
>> # A_prime will store the result of the operation.
>> Phi.PtAP(A, A_prime)
>> 
>>> On 5 Oct 2023, at 13:22, Pierre Jolivet  wrote:
>>> 
>>> How about using ptap which will use MatPtAP?
>>> It will be more efficient (and it will help you bypass the issue).
>>> 
>>> Thanks,
>>> Pierre
>>> 
 On 5 Oct 2023, at 1:18 PM, Thanasis Boutsikakis 
  wrote:
 
 Sorry, forgot function create_petsc_matrix()
 
 def create_petsc_matrix(input_array, sparse=True):
 """Create a PETSc matrix from an input_array
 
 Args:
 input_array (np array): Input array
 partition_like (PETSc mat, optional): Petsc matrix. Defaults to 
 None.
 sparse (bool, optional): Toggle for sparse or dense. Defaults to 
 True.
 
 Returns:
 PETSc mat: PETSc matrix
 """
 # Check if input_array is 1D and reshape if necessary
 assert len(input_array.shape) == 2, "Input array should be 
 2-dimensional"
 global_rows, global_cols = input_array.shape
 
 size = ((None, global_rows), (global_cols, global_cols))
 
 # Create a sparse or dense matrix based on the 'sparse' argument
 if sparse:
 matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
 else:
 matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
 matrix.setUp()
 
 local_rows_start, local_rows_end = matrix.getOwnershipRange()
 
 for counter, i in enumerate(range(local_rows_start, local_rows_end)):
 # Calculate the correct row in the array for the current process
 row_in_array = counter + local_rows_start
 matrix.setValues(
 i, range(global_cols), input_array[row_in_array, :], addv=False
 )
 
 # Assemble the matrix to compute the final structure
 matrix.assemblyBegin()
 matrix.assemblyEnd()
 
 return 

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-05 Thread Pierre Jolivet
Not a petsc4py expert here, but you may want to try instead:
A_prime = A.ptap(Phi)

Thanks,
Pierre
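
A usage note on that call, as far as I can tell from the petsc4py API: when no
result matrix is passed, Mat.ptap creates and returns the triple product, so
the preallocated A_prime in the quoted script below should not be needed:

A_prime = A.ptap(Phi)   # A' = Phi^T * A * Phi, result allocated by PETSc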

> On 5 Oct 2023, at 2:02 PM, Thanasis Boutsikakis 
>  wrote:
> 
> Thanks Pierre! So I tried this and got a segmentation fault. Is this supposed 
> to work right off the bat or am I missing something?
> 
> [0]PETSC ERROR: 
> 
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, 
> probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and 
> https://petsc.org/release/faq/
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is 
> causing the crash.
> Abort(59) on node 0 (rank 0 in comm 0): application called 
> MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> 
> """Experimenting with PETSc mat-mat multiplication"""
> 
> import time
> 
> import numpy as np
> from colorama import Fore
> from firedrake import COMM_SELF, COMM_WORLD
> from firedrake.petsc import PETSc
> from mpi4py import MPI
> from numpy.testing import assert_array_almost_equal
> 
> from utilities import (
> Print,
> create_petsc_matrix,
> print_matrix_partitioning,
> )
> 
> nproc = COMM_WORLD.size
> rank = COMM_WORLD.rank
> 
> # 
> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix 
> Phi
> #  A' = Phi.T * A * Phi
> # [k x k] <- [k x m] x [m x m] x [m x k]
> # 
> 
> m, k = 11, 7
> # Generate the random numpy matrices
> np.random.seed(0)  # sets the seed to 0
> A_np = np.random.randint(low=0, high=6, size=(m, m))
> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
> 
> # 
> # TEST: Galerkin projection of numpy matrices A_np and Phi_np
> # 
> Aprime_np = Phi_np.T @ A_np @ Phi_np
> Print(f"MATRIX Aprime_np [{Aprime_np.shape[0]}x{Aprime_np.shape[1]}]")
> Print(f"{Aprime_np}")
> 
> # Create A as an mpi matrix distributed on each process
> A = create_petsc_matrix(A_np, sparse=False)
> 
> # Create Phi as an mpi matrix distributed on each process
> Phi = create_petsc_matrix(Phi_np, sparse=False)
> 
> # Create an empty PETSc matrix object to store the result of the PtAP 
> operation.
> # This will hold the result A' = Phi.T * A * Phi after the computation.
> A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=False)
> 
> # Perform the PtAP (Phi Transpose times A times Phi) operation.
> # In mathematical terms, this operation is A' = Phi.T * A * Phi.
> # A_prime will store the result of the operation.
> Phi.PtAP(A, A_prime)
> 
>> On 5 Oct 2023, at 13:22, Pierre Jolivet  wrote:
>> 
>> How about using ptap which will use MatPtAP?
>> It will be more efficient (and it will help you bypass the issue).
>> 
>> Thanks,
>> Pierre
>> 
>>> On 5 Oct 2023, at 1:18 PM, Thanasis Boutsikakis 
>>>  wrote:
>>> 
>>> Sorry, forgot function create_petsc_matrix()
>>> 
>>> def create_petsc_matrix(input_array, sparse=True):
>>> """Create a PETSc matrix from an input_array
>>> 
>>> Args:
>>> input_array (np array): Input array
>>> partition_like (PETSc mat, optional): Petsc matrix. Defaults to 
>>> None.
>>> sparse (bool, optional): Toggle for sparse or dense. Defaults to 
>>> True.
>>> 
>>> Returns:
>>> PETSc mat: PETSc matrix
>>> """
>>> # Check if input_array is 1D and reshape if necessary
>>> assert len(input_array.shape) == 2, "Input array should be 
>>> 2-dimensional"
>>> global_rows, global_cols = input_array.shape
>>> 
>>> size = ((None, global_rows), (global_cols, global_cols))
>>> 
>>> # Create a sparse or dense matrix based on the 'sparse' argument
>>> if sparse:
>>> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
>>> else:
>>> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
>>> matrix.setUp()
>>> 
>>> local_rows_start, local_rows_end = matrix.getOwnershipRange()
>>> 
>>> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
>>> # Calculate the correct row in the array for the current process
>>> row_in_array = counter + local_rows_start
>>> matrix.setValues(
>>> i, range(global_cols), input_array[row_in_array, :], addv=False
>>> )
>>> 
>>> # Assemble the matrix to compute the final structure
>>> matrix.assemblyBegin()
>>> matrix.assemblyEnd()
>>> 
>>> return matrix
>>> 
 On 5 Oct 2023, at 13:09, Thanasis Boutsikakis 
  wrote:
 
 Hi everyone,
 
 I am trying a Galerkin projection (see MFE below) and I cannot get 
 Phi.transposeMatMult(A, A1) to work. The error is
 
   

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-05 Thread Thanasis Boutsikakis
Thanks Pierre! So I tried this and got a segmentation fault. Is this supposed 
to work right off the bat or am I missing something?

[0]PETSC ERROR: 

[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably 
memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and 
https://petsc.org/release/faq/
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing 
the crash.
Abort(59) on node 0 (rank 0 in comm 0): application called 
MPI_Abort(MPI_COMM_WORLD, 59) - process 0

"""Experimenting with PETSc mat-mat multiplication"""

import time

import numpy as np
from colorama import Fore
from firedrake import COMM_SELF, COMM_WORLD
from firedrake.petsc import PETSc
from mpi4py import MPI
from numpy.testing import assert_array_almost_equal

from utilities import (
Print,
create_petsc_matrix,
print_matrix_partitioning,
)

nproc = COMM_WORLD.size
rank = COMM_WORLD.rank

# 
# EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix Phi
#  A' = Phi.T * A * Phi
# [k x k] <- [k x m] x [m x m] x [m x k]
# 

m, k = 11, 7
# Generate the random numpy matrices
np.random.seed(0)  # sets the seed to 0
A_np = np.random.randint(low=0, high=6, size=(m, m))
Phi_np = np.random.randint(low=0, high=6, size=(m, k))

# 
# TEST: Galerkin projection of numpy matrices A_np and Phi_np
# 
Aprime_np = Phi_np.T @ A_np @ Phi_np
Print(f"MATRIX Aprime_np [{Aprime_np.shape[0]}x{Aprime_np.shape[1]}]")
Print(f"{Aprime_np}")

# Create A as an mpi matrix distributed on each process
A = create_petsc_matrix(A_np, sparse=False)

# Create Phi as an mpi matrix distributed on each process
Phi = create_petsc_matrix(Phi_np, sparse=False)

# Create an empty PETSc matrix object to store the result of the PtAP operation.
# This will hold the result A' = Phi.T * A * Phi after the computation.
A_prime = create_petsc_matrix(np.zeros((k, k)), sparse=False)

# Perform the PtAP (Phi Transpose times A times Phi) operation.
# In mathematical terms, this operation is A' = Phi.T * A * Phi.
# A_prime will store the result of the operation.
Phi.PtAP(A, A_prime)

> On 5 Oct 2023, at 13:22, Pierre Jolivet  wrote:
> 
> How about using ptap which will use MatPtAP?
> It will be more efficient (and it will help you bypass the issue).
> 
> Thanks,
> Pierre
> 
>> On 5 Oct 2023, at 1:18 PM, Thanasis Boutsikakis 
>>  wrote:
>> 
>> Sorry, forgot function create_petsc_matrix()
>> 
>> def create_petsc_matrix(input_array, sparse=True):
>> """Create a PETSc matrix from an input_array
>> 
>> Args:
>> input_array (np array): Input array
>> partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
>> sparse (bool, optional): Toggle for sparse or dense. Defaults to 
>> True.
>> 
>> Returns:
>> PETSc mat: PETSc matrix
>> """
>> # Check if input_array is 1D and reshape if necessary
>> assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
>> global_rows, global_cols = input_array.shape
>> 
>> size = ((None, global_rows), (global_cols, global_cols))
>> 
>> # Create a sparse or dense matrix based on the 'sparse' argument
>> if sparse:
>> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
>> else:
>> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
>> matrix.setUp()
>> 
>> local_rows_start, local_rows_end = matrix.getOwnershipRange()
>> 
>> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
>> # Calculate the correct row in the array for the current process
>> row_in_array = counter + local_rows_start
>> matrix.setValues(
>> i, range(global_cols), input_array[row_in_array, :], addv=False
>> )
>> 
>> # Assemble the matrix to compute the final structure
>> matrix.assemblyBegin()
>> matrix.assemblyEnd()
>> 
>> return matrix
>> 
>>> On 5 Oct 2023, at 13:09, Thanasis Boutsikakis 
>>>  wrote:
>>> 
>>> Hi everyone,
>>> 
>>> I am trying a Galerkin projection (see MFE below) and I cannot get 
>>> Phi.transposeMatMult(A, A1) to work. The error is
>>> 
>>> Phi.transposeMatMult(A, A1)
>>>   File "petsc4py/PETSc/Mat.pyx", line 1514, in 
>>> petsc4py.PETSc.Mat.transposeMatMult
>>> petsc4py.PETSc.Error: error code 56
>>> [0] MatTransposeMatMult() at 
>>> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:10135
>>> [0] MatProduct_Private() at 
>>> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9989

Re: [petsc-users] Galerkin projection using petsc4py

2023-10-05 Thread Pierre Jolivet
How about using ptap which will use MatPtAP?
It will be more efficient (and it will help you bypass the issue).

Thanks,
Pierre

> On 5 Oct 2023, at 1:18 PM, Thanasis Boutsikakis 
>  wrote:
> 
> Sorry, forgot function create_petsc_matrix()
> 
> def create_petsc_matrix(input_array, sparse=True):
> """Create a PETSc matrix from an input_array
> 
> Args:
> input_array (np array): Input array
> partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
> sparse (bool, optional): Toggle for sparse or dense. Defaults to 
> True.
> 
> Returns:
> PETSc mat: PETSc matrix
> """
> # Check if input_array is 1D and reshape if necessary
> assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
> global_rows, global_cols = input_array.shape
> 
> size = ((None, global_rows), (global_cols, global_cols))
> 
> # Create a sparse or dense matrix based on the 'sparse' argument
> if sparse:
> matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
> else:
> matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
> matrix.setUp()
> 
> local_rows_start, local_rows_end = matrix.getOwnershipRange()
> 
> for counter, i in enumerate(range(local_rows_start, local_rows_end)):
> # Calculate the correct row in the array for the current process
> row_in_array = counter + local_rows_start
> matrix.setValues(
> i, range(global_cols), input_array[row_in_array, :], addv=False
> )
> 
> # Assemble the matrix to compute the final structure
> matrix.assemblyBegin()
> matrix.assemblyEnd()
> 
> return matrix
> 
>> On 5 Oct 2023, at 13:09, Thanasis Boutsikakis 
>>  wrote:
>> 
>> Hi everyone,
>> 
>> I am trying a Galerkin projection (see MFE below) and I cannot get 
>> Phi.transposeMatMult(A, A1) to work. The error is
>> 
>> Phi.transposeMatMult(A, A1)
>>   File "petsc4py/PETSc/Mat.pyx", line 1514, in 
>> petsc4py.PETSc.Mat.transposeMatMult
>> petsc4py.PETSc.Error: error code 56
>> [0] MatTransposeMatMult() at 
>> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:10135
>> [0] MatProduct_Private() at 
>> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9989
>> [0] No support for this operation for this object type
>> [0] Call MatProductCreate() first
>> 
>> Do you know if these are exposed to petsc4py, or maybe there is another 
>> way? I cannot get the MFE to work (neither sequentially nor in parallel).
>> 
>> """Experimenting with PETSc mat-mat multiplication"""
>> 
>> import time
>> 
>> import numpy as np
>> from colorama import Fore
>> from firedrake import COMM_SELF, COMM_WORLD
>> from firedrake.petsc import PETSc
>> from mpi4py import MPI
>> from numpy.testing import assert_array_almost_equal
>> 
>> from utilities import (
>> Print,
>> create_petsc_matrix,
>> )
>> 
>> nproc = COMM_WORLD.size
>> rank = COMM_WORLD.rank
>> 
>> # 
>> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix 
>> Phi
>> #  A' = Phi.T * A * Phi
>> # [k x k] <- [k x m] x [m x m] x [m x k]
>> # 
>> 
>> m, k = 11, 7
>> # Generate the random numpy matrices
>> np.random.seed(0)  # sets the seed to 0
>> A_np = np.random.randint(low=0, high=6, size=(m, m))
>> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
>> 
>> # Create A as an mpi matrix distributed on each process
>> A = create_petsc_matrix(A_np)
>> 
>> # Create Phi as an mpi matrix distributed on each process
>> Phi = create_petsc_matrix(Phi_np)
>> 
>> A1 = create_petsc_matrix(np.zeros((k, m)))
>> 
>> # Now A1 contains the result of Phi^T * A
>> Phi.transposeMatMult(A, A1)
>> 
> 



Re: [petsc-users] Galerkin projection using petsc4py

2023-10-05 Thread Thanasis Boutsikakis
Sorry, forgot function create_petsc_matrix()

def create_petsc_matrix(input_array, sparse=True):
    """Create a PETSc matrix from an input_array

    Args:
        input_array (np array): Input array
        partition_like (PETSc mat, optional): Petsc matrix. Defaults to None.
        sparse (bool, optional): Toggle for sparse or dense. Defaults to True.

    Returns:
        PETSc mat: PETSc matrix
    """
    # Check if input_array is 1D and reshape if necessary
    assert len(input_array.shape) == 2, "Input array should be 2-dimensional"
    global_rows, global_cols = input_array.shape

    size = ((None, global_rows), (global_cols, global_cols))

    # Create a sparse or dense matrix based on the 'sparse' argument
    if sparse:
        matrix = PETSc.Mat().createAIJ(size=size, comm=COMM_WORLD)
    else:
        matrix = PETSc.Mat().createDense(size=size, comm=COMM_WORLD)
    matrix.setUp()

    local_rows_start, local_rows_end = matrix.getOwnershipRange()

    for counter, i in enumerate(range(local_rows_start, local_rows_end)):
        # Calculate the correct row in the array for the current process
        row_in_array = counter + local_rows_start
        matrix.setValues(
            i, range(global_cols), input_array[row_in_array, :], addv=False
        )

    # Assemble the matrix to compute the final structure
    matrix.assemblyBegin()
    matrix.assemblyEnd()

    return matrix

> On 5 Oct 2023, at 13:09, Thanasis Boutsikakis 
>  wrote:
> 
> Hi everyone,
> 
> I am trying a Galerkin projection (see MFE below) and I cannot get 
> Phi.transposeMatMult(A, A1) to work. The error is
> 
> Phi.transposeMatMult(A, A1)
>   File "petsc4py/PETSc/Mat.pyx", line 1514, in 
> petsc4py.PETSc.Mat.transposeMatMult
> petsc4py.PETSc.Error: error code 56
> [0] MatTransposeMatMult() at 
> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:10135
> [0] MatProduct_Private() at 
> /Users/boutsitron/firedrake/src/petsc/src/mat/interface/matrix.c:9989
> [0] No support for this operation for this object type
> [0] Call MatProductCreate() first
> 
> Do you know if these are exposed to petsc4py, or maybe there is another way? 
> I cannot get the MFE to work (neither sequentially nor in parallel).
> 
> """Experimenting with PETSc mat-mat multiplication"""
> 
> import time
> 
> import numpy as np
> from colorama import Fore
> from firedrake import COMM_SELF, COMM_WORLD
> from firedrake.petsc import PETSc
> from mpi4py import MPI
> from numpy.testing import assert_array_almost_equal
> 
> from utilities import (
> Print,
> create_petsc_matrix,
> )
> 
> nproc = COMM_WORLD.size
> rank = COMM_WORLD.rank
> 
> # 
> # EXP: Galerkin projection of an mpi PETSc matrix A with an mpi PETSc matrix 
> Phi
> #  A' = Phi.T * A * Phi
> # [k x k] <- [k x m] x [m x m] x [m x k]
> # 
> 
> m, k = 11, 7
> # Generate the random numpy matrices
> np.random.seed(0)  # sets the seed to 0
> A_np = np.random.randint(low=0, high=6, size=(m, m))
> Phi_np = np.random.randint(low=0, high=6, size=(m, k))
> 
> # Create A as an mpi matrix distributed on each process
> A = create_petsc_matrix(A_np)
> 
> # Create Phi as an mpi matrix distributed on each process
> Phi = create_petsc_matrix(Phi_np)
> 
> A1 = create_petsc_matrix(np.zeros((k, m)))
> 
> # Now A1 contains the result of Phi^T * A
> Phi.transposeMatMult(A, A1)
>
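
A hedged note on the original MFE: in petsc4py, transposeMatMult can also be
called without a preallocated result, in which case it creates and returns the
product itself. Letting it do so, instead of passing the separately assembled
A1, may avoid the "Call MatProductCreate() first" error (the parallel layout
issue discussed earlier in the thread still applies):

A1 = Phi.transposeMatMult(A)   # A1 = Phi^T * A, [k x m], created by PETSc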