Re: [petsc-users] Periodic boundary conditions using swarm

2024-04-21 Thread MIGUEL MOLINOS PEREZ
Thank you Matt, I will definitely do that. Thrilled to add a humble 
contribution to PETSc :-)

Best,
Miguel

On 21 Apr 2024, at 16:01, Matthew Knepley  wrote:

On Sun, Apr 21, 2024 at 9:38 AM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:
Dear Matt,

Thank you for your answer. In addition to your suggestion, I solved a bug in the 
test (I was not updating the local size integer during the time loop). Indeed, 
if I turn off periodicity it works. Furthermore, if I use DM_BOUNDARY_TWIST 
instead, it works too.

However, if I turn on DM_BOUNDARY_PERIODIC, I get an error in the search 
algorithm I implemented for the domain decomposition, inspired by 
https://petsc.org/main/src/dm/tutorials/swarm_ex3.c.html. The algorithm is not 
capable of finding some of the particles at the initial stage of the simulation 
(without transport).

It looks like the error is on my end; however, it is puzzling why it works for 
DM_BOUNDARY_TWIST but not for DM_BOUNDARY_PERIODIC.

I usually solve these things by making a simple example. We could make another 
test in Swarm test ex1 that uses periodicity. If you send over a draft that 
fails, I can help you debug it. It would make a fantastic contribution to PETSc.
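
[For reference, a minimal sketch of what such a periodic Swarm test might look like. The mesh sizes, the y boundary type and the insertion layout are illustrative choices, not the actual ex1 source.]

#include <petscdmda.h>
#include <petscdmswarm.h>

int main(int argc, char **argv)
{
  DM da, sw;

  PetscFunctionBeginUser;
  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  /* Background mesh, periodic in x only */
  PetscCall(DMDACreate2d(PETSC_COMM_WORLD, DM_BOUNDARY_PERIODIC, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, 8, 8, PETSC_DECIDE, PETSC_DECIDE, 1, 1, NULL, NULL, &da));
  PetscCall(DMSetFromOptions(da));
  PetscCall(DMSetUp(da));
  PetscCall(DMDASetUniformCoordinates(da, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0));

  /* Particle swarm attached to the periodic cell DM */
  PetscCall(DMCreate(PETSC_COMM_WORLD, &sw));
  PetscCall(DMSetType(sw, DMSWARM));
  PetscCall(DMSetDimension(sw, 2));
  PetscCall(DMSwarmSetType(sw, DMSWARM_PIC));
  PetscCall(DMSwarmSetCellDM(sw, da));
  PetscCall(DMSwarmFinalizeFieldRegister(sw));
  PetscCall(DMSwarmInsertPointsUsingCellDM(sw, DMSWARMPIC_LAYOUT_REGULAR, 2));

  /* ... advect particles, wrap coordinates across the periodic face, then ... */
  PetscCall(DMSwarmMigrate(sw, PETSC_TRUE));

  PetscCall(DMDestroy(&sw));
  PetscCall(DMDestroy(&da));
  PetscCall(PetscFinalize());
  return 0;
}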

   Thanks,

 Matt

Thanks,
Miguel

On 21 Apr 2024, at 14:53, Matthew Knepley 
<knep...@gmail.com> wrote:

On Thu, Apr 18, 2024 at 8:23 AM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:

Dear all,

I am working on the implementation of periodic BCs using a PIC-style 
discretisation. I am working with a test case which consists of solving the 
advection of a set of particles inside a box (DMDA mesh) with periodic BCs on 
the x axis.

My implementation updates the position of each particle with a velocity field; 
afterwards I check whether the particle is inside the supercell (periodic box) 
or not. If not, I correct the position using the periodic BCs. Once this step is 
done, I call DMSwarmMigrate.

It works in serial, but crashes in parallel with MPI (see attached nohup file). 
I have checked some of the DMSwarmMigrate examples, and they look similar to 
my implementation. However, they do not use periodic BCs.

Am I missing any step in addition to DMSwarmMigrate?

It does not sound like it. We do have parallel examples of periodic migration, 
such as Swarm ex9.

What happens if you turn off periodicity and just let particles fall out of the 
box? Does it still crash?

  Thanks,

Matt

Best regards
Miguel



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



Re: [petsc-users] Periodic boundary conditions using swarm

2024-04-21 Thread MIGUEL MOLINOS PEREZ
Dear Matt,

Thank you for your answer. In addition to your suggestion, I solved a bug in the 
test (I was not updating the local size integer during the time loop). Indeed, 
if I turn off periodicity it works. Furthermore, if I use DM_BOUNDARY_TWIST 
instead, it works too.

However, if I turn on DM_BOUNDARY_PERIODIC, I get an error in the search 
algorithm I implemented for the domain decomposition, inspired by 
https://petsc.org/main/src/dm/tutorials/swarm_ex3.c.html. The algorithm is not 
capable of finding some of the particles at the initial stage of the simulation 
(without transport).

It looks like the error is on my end; however, it is puzzling why it works for 
DM_BOUNDARY_TWIST but not for DM_BOUNDARY_PERIODIC.

Thanks,
Miguel

On 21 Apr 2024, at 14:53, Matthew Knepley  wrote:

On Thu, Apr 18, 2024 at 8:23 AM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:

Dear all,

I am working on the implementation of periodic BCs using a PIC-style 
discretisation. I am working with a test case which consists of solving the 
advection of a set of particles inside a box (DMDA mesh) with periodic BCs on 
the x axis.

My implementation updates the position of each particle with a velocity field; 
afterwards I check whether the particle is inside the supercell (periodic box) 
or not. If not, I correct the position using the periodic BCs. Once this step is 
done, I call DMSwarmMigrate.

It works in serial, but crashes in parallel with MPI (see attached nohup file). 
I have checked some of the DMSwarmMigrate examples, and they look similar to 
my implementation. However, they do not use periodic BCs.

Am I missing any step in addition to DMSwarmMigrate?

It does not sound like it. We do have parallel examples of periodic migration, 
such as Swarm ex9.

What happens if you turn off periodicity and just let particles fall out of the 
box? Does it still crash?

  Thanks,

Matt

Best regards
Miguel



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



[petsc-users] Periodic boundary conditions using swarm

2024-04-18 Thread MIGUEL MOLINOS PEREZ

Dear all,

I am working on the implementation of periodic BCs using a PIC-style 
discretisation. I am working with a test case which consists of solving the 
advection of a set of particles inside a box (DMDA mesh) with periodic BCs on 
the x axis.

My implementation updates the position of each particle with a velocity field; 
afterwards I check whether the particle is inside the supercell (periodic box) 
or not. If not, I correct the position using the periodic BCs. Once this step is 
done, I call DMSwarmMigrate.
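
[A minimal sketch of the update/wrap/migrate step described above. The time step dt, the velocity array vel, the box length Lx and the swarm DM sw are assumptions; only the x coordinate is wrapped, matching the periodic direction.]

PetscReal *coor;
PetscInt   npoints, p, dim = 3;

PetscCall(DMSwarmGetLocalSize(sw, &npoints));
PetscCall(DMSwarmGetField(sw, DMSwarmPICField_coor, NULL, NULL, (void **)&coor));
for (p = 0; p < npoints; ++p) {
  coor[p * dim + 0] += dt * vel[p * dim + 0];                        /* advect in x */
  coor[p * dim + 0] -= Lx * PetscFloorReal(coor[p * dim + 0] / Lx);  /* wrap into [0, Lx) */
}
PetscCall(DMSwarmRestoreField(sw, DMSwarmPICField_coor, NULL, NULL, (void **)&coor));
PetscCall(DMSwarmMigrate(sw, PETSC_TRUE)); /* reassign wrapped particles to their owning ranks */
/* note: the local size must be queried again after DMSwarmMigrate(), since particles move between ranks */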

It works in serial, but crashes in parallel with MPI (see attached nohup file). 
I have checked some of the DMSwarmMigrate examples, and they look similar to 
my implementation. However, they do not use periodic BCs.

Am I missing any step in addition to DMSwarmMigrate?

Best regards
Miguel



nohup.out
Description: nohup.out


Re: [petsc-users] Error loading data coming from a .hdf5 file into a DMSwarm

2024-04-01 Thread MIGUEL MOLINOS PEREZ
I see. Thank you for the feedback Matthew!

Thanks,
Miguel

On 1 Apr 2024, at 18:32, Matthew Knepley  wrote:

On Mon, Apr 1, 2024 at 11:13 AM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:
Dear Matthew,

Thank you for your suggestion. I tried to update the vector with the 
information coming from the hdf5 file inside the main function. Then I print 
the vector two times (see the lines below); the first time it has the correct 
data. However, the second time it has the same values, as if I had never updated 
it with VecLoad.

Is there an alternative way of initialising a vector coming from DMSwarm with 
previously stored information (keeping the parallel structure)?

Oh, you cannot use the Swarm vectors for VecLoad(). They are just a view into 
the particle data, and that view is destroyed on restore. Swarm data is stored 
in a particle-like data structure, not in Vecs. If you want to load this Vec, 
you have to duplicate exactly as you did. This interface is likely to change in 
the next year to make this problem go away.
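
[A sketch of the pattern Matt describes, using the names from the code quoted above. Whether the values are pushed back with VecCopy (as here) or VecSwap (as in the workaround in the next message) is an assumption; what matters is that they end up in the field vector before it is restored.]

Vec stdv_q, tmp;
PetscCall(DMSwarmCreateGlobalVectorFromField(Simulation.atomistic_data, "stdv-q", &stdv_q));
PetscCall(VecDuplicate(stdv_q, &tmp));                        /* a real Vec, not a view into the swarm */
PetscCall(PetscObjectSetName((PetscObject)tmp, "DMSwarmSharedField_stdv-q")); /* must match the HDF5 dataset name */
PetscCall(VecLoad(tmp, viewer_hdf5));                         /* load into the duplicate */
PetscCall(VecCopy(tmp, stdv_q));                              /* copy the values into the swarm view */
PetscCall(VecDestroy(&tmp));
PetscCall(DMSwarmDestroyGlobalVectorFromField(Simulation.atomistic_data, "stdv-q", &stdv_q));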

  Thanks,

Matt

Miguel

//! Load Hdf5 viewer
PetscViewer viewer_hdf5;
REQUIRE_NOTHROW(PetscViewerHDF5Open(MPI_COMM_WORLD, Output_hdf5_file,
                                    FILE_MODE_READ, &viewer_hdf5));
REQUIRE_NOTHROW(PetscViewerHDF5PushTimestepping(viewer_hdf5));

//! Load the vector and fill it with the information from the .hdf5 file
Vec stdv_q;
REQUIRE_NOTHROW(DMSwarmCreateGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

REQUIRE_NOTHROW(PetscViewerHDF5PushGroup(viewer_hdf5, "/particle_fields"));
REQUIRE_NOTHROW(VecLoad(stdv_q, viewer_hdf5));
REQUIRE_NOTHROW(VecView(stdv_q, PETSC_VIEWER_STDOUT_WORLD));
REQUIRE_NOTHROW(PetscViewerHDF5PopGroup(viewer_hdf5));

REQUIRE_NOTHROW(DMSwarmDestroyGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

//! Destroy HDF5 context
REQUIRE_NOTHROW(PetscViewerDestroy(&viewer_hdf5));

//! Load the vector again and print
REQUIRE_NOTHROW(DMSwarmCreateGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

REQUIRE_NOTHROW(VecView(stdv_q, PETSC_VIEWER_STDOUT_WORLD));

REQUIRE_NOTHROW(DMSwarmDestroyGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

Best,
Miguel

Miguel Molinos
Investigador postdoctoral
Juan de la Cierva
Dpto. Mecánica de Medios Continuos y Teoría de Estructuras - ETSI
Universidad de Sevilla
Camino de los descubrimientos, s/n
41092 Sevilla




On 1 Apr 2024, at 16:28, Matthew Knepley 
<knep...@gmail.com> wrote:

From the description, my guess is that this is pointer confusion. The vector 
inside the function is different from the vector outside the function.




--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



Re: [petsc-users] Error loading data coming from a .hdf5 file into a DMSwarm

2024-04-01 Thread MIGUEL MOLINOS PEREZ
Dear Matthew,

I came up with a workaround for the problem. I duplicate the vector and use the 
duplicated copy to read the information from the hdf5 file. Then I swap both 
vectors and delete the copy. If I invoke VecView outside of the function, the 
value has been modified properly. However, this solution seems a little bit 
“ugly”. I share it just in case someone is facing a similar problem or it can 
help to understand what is going wrong with my previous implementation.

Best,
Miguel

Vec stdv_q;
PetscCall(DMSwarmCreateGlobalVectorFromField(Simulation->atomistic_data,
                                             "stdv-q", &stdv_q));

Vec stdv_q_hdf5;
PetscCall(VecDuplicate(stdv_q, &stdv_q_hdf5));
PetscCall(PetscObjectSetName((PetscObject)stdv_q_hdf5,
                             "DMSwarmSharedField_stdv-q"));

PetscCall(VecLoad(stdv_q_hdf5, viewer_hdf5));

PetscCall(VecSwap(stdv_q, stdv_q_hdf5));
PetscCall(VecDestroy(&stdv_q_hdf5));

PetscCall(DMSwarmDestroyGlobalVectorFromField(Simulation->atomistic_data,
                                              "stdv-q", &stdv_q));

On 1 Apr 2024, at 17:13, Miguel Molinos  wrote:

Dear Matthew,

Thank you for your suggestion. I tried to update the vector with the 
information coming from the hdf5 file inside the main function. Then I print 
the vector two times (see the lines below); the first time it has the correct 
data. However, the second time it has the same values, as if I had never updated 
it with VecLoad.

Is there an alternative way of initialising a vector coming from DMSwarm with 
previously stored information (keeping the parallel structure)?

Miguel

//! Load Hdf5 viewer
PetscViewer viewer_hdf5;
REQUIRE_NOTHROW(PetscViewerHDF5Open(MPI_COMM_WORLD, Output_hdf5_file,
                                    FILE_MODE_READ, &viewer_hdf5));
REQUIRE_NOTHROW(PetscViewerHDF5PushTimestepping(viewer_hdf5));

//! Load the vector and fill it with the information from the .hdf5 file
Vec stdv_q;
REQUIRE_NOTHROW(DMSwarmCreateGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

REQUIRE_NOTHROW(PetscViewerHDF5PushGroup(viewer_hdf5, "/particle_fields"));
REQUIRE_NOTHROW(VecLoad(stdv_q, viewer_hdf5));
REQUIRE_NOTHROW(VecView(stdv_q, PETSC_VIEWER_STDOUT_WORLD));
REQUIRE_NOTHROW(PetscViewerHDF5PopGroup(viewer_hdf5));

REQUIRE_NOTHROW(DMSwarmDestroyGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

//! Destroy HDF5 context
REQUIRE_NOTHROW(PetscViewerDestroy(&viewer_hdf5));

//! Load the vector again and print
REQUIRE_NOTHROW(DMSwarmCreateGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

REQUIRE_NOTHROW(VecView(stdv_q, PETSC_VIEWER_STDOUT_WORLD));

REQUIRE_NOTHROW(DMSwarmDestroyGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

Best,
Miguel

Miguel Molinos
Investigador postdoctoral
Juan de la Cierva
Dpto. Mecánica de Medios Continuos y Teoría de Estructuras - ETSI
Universidad de Sevilla
Camino de los descubrimientos, s/n
41092 Sevilla




On 1 Apr 2024, at 16:28, Matthew Knepley  wrote:

From the description, my guess is that this is pointer confusion. The vector 
inside the function is different from the vector outside the function.





Re: [petsc-users] Error loading data coming from a .hdf5 file into a DMSwarm

2024-04-01 Thread MIGUEL MOLINOS PEREZ
Dear Matthew,

Thank you for your suggestion. I tried to update the vector with the 
information coming from the hdf5 file inside the main function. Then I print 
the vector two times (see the lines below); the first time it has the correct 
data. However, the second time it has the same values, as if I had never updated 
it with VecLoad.

Is there an alternative way of initialising a vector coming from DMSwarm with 
previously stored information (keeping the parallel structure)?

Miguel

//! Load Hdf5 viewer
PetscViewer viewer_hdf5;
REQUIRE_NOTHROW(PetscViewerHDF5Open(MPI_COMM_WORLD, Output_hdf5_file,
                                    FILE_MODE_READ, &viewer_hdf5));
REQUIRE_NOTHROW(PetscViewerHDF5PushTimestepping(viewer_hdf5));

//! Load the vector and fill it with the information from the .hdf5 file
Vec stdv_q;
REQUIRE_NOTHROW(DMSwarmCreateGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

REQUIRE_NOTHROW(PetscViewerHDF5PushGroup(viewer_hdf5, "/particle_fields"));
REQUIRE_NOTHROW(VecLoad(stdv_q, viewer_hdf5));
REQUIRE_NOTHROW(VecView(stdv_q, PETSC_VIEWER_STDOUT_WORLD));
REQUIRE_NOTHROW(PetscViewerHDF5PopGroup(viewer_hdf5));

REQUIRE_NOTHROW(DMSwarmDestroyGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

//! Destroy HDF5 context
REQUIRE_NOTHROW(PetscViewerDestroy(&viewer_hdf5));

//! Load the vector again and print
REQUIRE_NOTHROW(DMSwarmCreateGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

REQUIRE_NOTHROW(VecView(stdv_q, PETSC_VIEWER_STDOUT_WORLD));

REQUIRE_NOTHROW(DMSwarmDestroyGlobalVectorFromField(
    Simulation.atomistic_data, "stdv-q", &stdv_q));

Best,
Miguel

Miguel Molinos
Investigador postdoctoral
Juan de la Cierva
Dpto. Mecánica de Medios Continuos y Teoría de Estructuras - ETSI
Universidad de Sevilla
Camino de los descubrimientos, s/n
41092 Sevilla




On 1 Apr 2024, at 16:28, Matthew Knepley  wrote:

From the description, my guess is that this is pointer confusion. The vector 
inside the function is different from the vector outside the function.




[petsc-users] Error loading data coming from a .hdf5 file into a DMSwarm

2024-03-31 Thread MIGUEL MOLINOS PEREZ
Dear all,

I am writing a function which stores datasets (Vectors) coming from a DMSwarm 
structure into an hdf5 file. This step is done nicely:

write_function(){
PetscViewerHDF5Open(…)
PetscViewerHDF5PushTimestepping(…)
DMSwarmCreateGlobalVectorFromField(…)
VecView(…)
DMSwarmDestroyGlobalVectorFromField(…)
}

The resulting hdf5 file looks good after inspection with Python's h5py library.

However, I am having difficulties when I try to use this .hdf5 file as a fresh 
start for my application. The target field is not properly updated when I try 
to load the stored data (it keeps the default values).

read_function(){
…
PetscViewerHDF5Open(…)
PetscViewerHDF5PushTimestepping(…)
DMSwarmCreateGlobalVectorFromField(…)
VecLoad(… )
DMSwarmDestroyGlobalVectorFromField(…)
...
}

The puzzling part is: if I print the “updated” vector inside read_function() 
using VecView after VecLoad, the vector seems to hold the updated values. 
However, if I print the field in the main function after the call to 
read_function(), the field remains the same as it was before calling 
read_function(), and I do not get any error message.

Is there something wrong with the logic of my programming? Maybe I am missing 
something.

Thank you in advance.

Best regards,
Miguel

Miguel Molinos
Investigador postdoctoral
Juan de la Cierva
Dpto. Mecánica de Medios Continuos y Teoría de Estructuras - ETSI
Universidad de Sevilla
Camino de los descubrimientos, s/n
41092 Sevilla



Re: [petsc-users] Ghost points in DMSwarm

2024-02-25 Thread MIGUEL MOLINOS PEREZ
I see, so I'm the one in charge of the synchronization between the particles 
and their associated replicas, right?

Best,
Miguel

On Feb 25, 2024, at 6:53 PM, Matthew Knepley  wrote:

On Sun, Feb 25, 2024 at 7:21 AM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:
Dear Matthew,

Thank you for answering me outside office hours and for the suggestion. If I 
interpret it correctly, your point is to use the background mesh halo to 
identify and create ghost particles in the overlapping region using 
DMSwarmMigrate().

Then two questions hit me:

  *   If I change a variable in the “real” particle (say the position), it 
should be updated in the replicas on the overlapped regions, right? Is this sort 
of communication included in DMSwarmMigrate()?

No, the particles that are migrated are copies. There is no current analogue of 
DMGlobalToLocal(). It would need to be constructed. It is possible, but would 
take writing some code. All applications we have supported to date do not use 
replicated particles.

  *   The function DMSwarmCreateLocalVectorFromField() does not include extra 
space for ghost particles. Therefore, I should resize the local size for each 
rank to accommodate the replicas manually. Right?

It uses all particles on the process. If some of those are replicas, they will 
be included.

  Thanks,

 Matt

Anyway, I think this will take me more time than I expected originally. 
Meanwhile, I will go ahead with my “humble” parallelization scheme, which 
consists in using MPI_Allreduce to create serial vectors with the variables I 
need to evaluate the residual equations of my problem rank-wise (please don't 
judge me, publish or perish...).

Many thanks,
Miguel


On Feb 24, 2024, at 4:24 PM, Matthew Knepley 
<knep...@gmail.com> wrote:

On Fri, Feb 23, 2024 at 5:14 PM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:
Dear all,

I am struggling with how to include ghost points in DMSwarm local vectors. 
According to the PETSc documentation it seems “straightforward” for a DMDA mesh. 
However, I am not so sure how to do it for a DMSwarm. In fact, if I add up the 
result of DMSwarmGetLocalSize for each rank, the result is the exact number of 
particles in the system, which means that the halo is zero.

Since there is a function called DMSwarmCreateLocalVectorFromField, I was 
wondering if there is a function already implemented in PETSc (and I'm missing 
it) to include ghost points in DMSwarm, so I don't have to reinvent the wheel. 
If so, is there any example out there I can follow?

DMSwarm is currently different from the other DMs. There is no idea of identity 
for particles, so there is no idea of a shared particle. You can create this by 
adding an ID to your particle fields, but it would be a manual process. 
DMSwarmMigrate() can duplicate particles if you have an overlapping mesh, which 
would mimic shared particles.
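
[For illustration, a sketch of what such a manual ID could look like. The field name "gid" and the numbering via an exclusive prefix sum are assumptions, not an existing PETSc facility.]

/* Register a persistent per-particle ID before DMSwarmFinalizeFieldRegister() */
PetscCall(DMSwarmRegisterPetscDatatypeField(sw, "gid", 1, PETSC_INT));
/* ... after the swarm sizes are set, number the particles once ... */
PetscInt *gid, nlocal, offset = 0;
PetscCall(DMSwarmGetLocalSize(sw, &nlocal));
PetscCallMPI(MPI_Exscan(&nlocal, &offset, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD));
PetscCall(DMSwarmGetField(sw, "gid", NULL, NULL, (void **)&gid));
for (PetscInt p = 0; p < nlocal; ++p) gid[p] = offset + p; /* unique across all ranks */
PetscCall(DMSwarmRestoreField(sw, "gid", NULL, NULL, (void **)&gid));
/* the "gid" field travels with each particle through DMSwarmMigrate(),
   so replicas can later be matched to their originals */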

  Thanks,

 Matt

Thanks in advance.

Regards,
Miguel

https://petsc.org/release/manual/vec/




--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



Re: [petsc-users] Ghost points in DMSwarm

2024-02-25 Thread MIGUEL MOLINOS PEREZ
Dear Matthew,

Thank you for answering me outside office hours and for the suggestion. If I 
interpret it correctly, your point is to use the background mesh halo to 
identify and create ghost particles in the overlapping region using 
DMSwarmMigrate().

Then two questions hit me:

  *   If I change a variable in the “real” particle (say the position), it 
should be updated in the replicas on the overlapped regions, right? Is this sort 
of communication included in DMSwarmMigrate()?
  *   The function DMSwarmCreateLocalVectorFromField() does not include extra 
space for ghost particles. Therefore, I should resize the local size for each 
rank to accommodate the replicas manually. Right?

Anyway, I think this will take me more time than I expected originally. 
Meanwhile, I will go ahead with my “humble” parallelization scheme, which 
consists in using MPI_Allreduce to create serial vectors with the variables I 
need to evaluate the residual equations of my problem rank-wise (please don't 
judge me, publish or perish...).

Many thanks,
Miguel


On Feb 24, 2024, at 4:24 PM, Matthew Knepley  wrote:

On Fri, Feb 23, 2024 at 5:14 PM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:
Dear all,

I am struggling with how to include ghost points in DMSwarm local vectors. 
According to the PETSc documentation it seems “straightforward” for a DMDA mesh. 
However, I am not so sure how to do it for a DMSwarm. In fact, if I add up the 
result of DMSwarmGetLocalSize for each rank, the result is the exact number of 
particles in the system, which means that the halo is zero.

Since there is a function called DMSwarmCreateLocalVectorFromField, I was 
wondering if there is a function already implemented in PETSc (and I'm missing 
it) to include ghost points in DMSwarm, so I don't have to reinvent the wheel. 
If so, is there any example out there I can follow?

DMSwarm is currently different from the other DMs. There is no idea of identity 
for particles, so there is no idea of a shared particle. You can create this by 
adding an ID to your particle fields, but it would be a manual process. 
DMSwarmMigrate() can duplicate particles if you have an overlapping mesh, which 
would mimic shared particles.

  Thanks,

 Matt

Thanks in advance.

Regards,
Miguel

https://petsc.org/release/manual/vec/




--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/



[petsc-users] Ghost points in DMSwarm

2024-02-23 Thread MIGUEL MOLINOS PEREZ
Dear all,

I am struggling with how to include ghost points in DMSwarm local vectors. 
According to the PETSc documentation it seems “straightforward” for a DMDA mesh. 
However, I am not so sure how to do it for a DMSwarm. In fact, if I add up the 
result of DMSwarmGetLocalSize for each rank, the result is the exact number of 
particles in the system, which means that the halo is zero.

Since there is a function called DMSwarmCreateLocalVectorFromField, I was 
wondering if there is a function already implemented in PETSc (and I'm missing 
it) to include ghost points in DMSwarm, so I don't have to reinvent the wheel. 
If so, is there any example out there I can follow?

Thanks in advance.

Regards,
Miguel


https://petsc.org/release/manual/vec/




Re: [petsc-users] Domain decomposition in PETSc for Molecular Dynamics

2023-12-13 Thread MIGUEL MOLINOS PEREZ
Thank you for the feedback. Seems like the second option is the right one 
(according to MD literature).

Best,
Miguel

On 13 Dec 2023, at 01:43, Matthew Knepley  wrote:

On Tue, Dec 12, 2023 at 7:28 PM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:
I meant the list of atoms which lie inside a sphere of radius R_cutoff 
centered at the mean position of a given atom.

Okay, this is possible in parallel, but would require hard work and I have not 
done it yet, although I think all the tools are coded.

In serial, there are at least two ways to do it. First, you can use a k-d tree 
implementation, since they usually have the radius query. I have not put one of 
these in, because I did not like any implementation, but Underworld and 
Firedrake have and it works fine. Second, you can choose a grid size of 
R_cutoff for the background grid, and then check neighbors. This is probably 
how I would start.

  Thanks,

 Matt

Best,
Miguel

On 13 Dec 2023, at 01:14, Matthew Knepley 
<knep...@gmail.com> wrote:


On Tue, Dec 12, 2023 at 2:36 PM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:
Dear Matthew and Mark,

Thank you for your useful guidance. I have taken the example in 
"dm/tutorials/swarm_ex3.c" as a starting point to build a first approximation of 
domain decomposition in my molecular dynamics code (diffusive molecular 
dynamics, to be more precise :-) ), and I must say that I am very happy with the 
result. However, in my journey integrating domain decomposition into my code, I 
am facing some new (and expected) problems. The first is in the implementation 
of the nearest-neighbour algorithm (the list of atoms closest to each atom).

Can you help me understand this? For a given atom, there should be a single 
"closest" atom (barring
degeneracies in distance). What do you mean by the list of closest atoms?

  Thanks,

 Matt

My current approach to the problem is a brute-force algorithm (a double loop 
through the list of atoms, computing the distances). However, it seems that if I 
call the "neighbours" function after the "DMSwarmMigrate" function, the search 
algorithm does not work correctly. My thoughts / hints are:

  *   The two nested for loops should be done on the global indexing of the 
atoms instead of the local one (I don't know how to get this number; one way to 
obtain it is sketched after this list).
  *   If I print the mean position of atom #0 (for example), each rank prints a 
different value of the average position. One of them is the correct position 
corresponding to site #0; the others are different (but identically labelled) 
atomic sites, which means that the site_i index is not bijective.
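
[Regarding the first point, a possible (untested) sketch: an exclusive prefix sum over the local sizes gives each rank an offset, so the global index of local atom site_i is offset + site_i.]

PetscInt n_atoms_local, offset = 0;
PetscCall(DMSwarmGetLocalSize(Simulation->atomistic_data, &n_atoms_local));
PetscCallMPI(MPI_Exscan(&n_atoms_local, &offset, 1, MPIU_INT, MPI_SUM, PETSC_COMM_WORLD));
/* global index of local atom site_i: offset + site_i */
/* note: these offsets change after every DMSwarmMigrate(); a persistent identity
   would need an explicit ID field carried with each particle (see the
   "Ghost points in DMSwarm" thread above) */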

I believe that solving this problem will increase my understanding of the 
domain decomposition approach and may allow me to fix the remaining parts of my 
code.

Any additional comments are greatly appreciated. For instance, I would be happy 
to be pointed to any piece of code (e.g., a PETSc example) which solves a 
similar problem, so that I can learn by example.

Many thanks in advance.

Best,
Miguel

This is the piece of code (simplified) which computes the list of neighbours 
for each atomic site. DMD is a structure which contains the atomistic 
information (DMSWARM), and the background mesh and bounding cell (DMDA and 
DMShell)

int neighbours(DMD* Simulation) {

  PetscFunctionBegin;

  //! (declarations restored; some arguments below were garbled in the archive)
  PetscMPIInt rank;
  PetscInt n_atoms_local, dim;
  double* mean_q_ptr;

  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  PetscCall(DMSwarmGetLocalSize(Simulation->atomistic_data, &n_atoms_local));

  //! Get array with the mean position of the atoms
  DMSwarmGetField(Simulation->atomistic_data, DMSwarmPICField_coor, &dim, NULL,
                  (void**)&mean_q_ptr);
  //! View the coordinates as an n_atoms_local x dim row-major matrix
  Eigen::Map<Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic, Eigen::RowMajor>>
      mean_q(mean_q_ptr, n_atoms_local, dim);

  int* neigh = Simulation->neigh;
  int* numneigh = Simulation->numneigh;

  for (unsigned int site_i = 0; site_i < n_atoms_local; site_i++) {

    //! Get mean position of site i
    Eigen::Vector3d mean_q_i = mean_q.block<1, 3>(site_i, 0);

    //! Search neighbours in the main cell (+ periodic cells)
    for (unsigned site_j = 0; site_j < n_atoms_local; site_j++) {
      if (site_i != site_j) {
        //! Get mean position of site j in the periodic box
        Eigen::Vector3d mean_q_j = mean_q.block<1, 3>(site_j, 0);

        //! Check if site j is in the neighbourhood of site i
        double norm_r_ij = (mean_q_i - mean_q_j).norm();
        if ((norm_r_ij <= r_cutoff_ADP) && (numneigh[site_i] < maxneigh)) {
          neigh[site_i * maxneigh + numneigh[site_i]] = site_j;
          numneigh[site_i] += 1;
        }
      }
    }

  } // MPI for loop (site_i)

  DMSwarmRestoreField(Simulation->atomistic_data, DMSwarmPICField_coor, &dim,
                      NULL, (void**)&mean_q_ptr);

  return EXIT_SUCCESS;
}


This is the piece of code that I use to read the atomic positions (mean_q) from 
a file:
//! @brief mean_q: Mean value of each atomic position
double* mean_q;
PetscCall(DMSwarmGetField(atomistic_data, DMSwarmPICField_coor, &dim,
                          NULL, (void**)&mean_q));

cnt = 0;
for (PetscInt site_i = 0; site_i < n_atoms_local; site_i++) {
if (cnt 

Re: [petsc-users] Domain decomposition in PETSc for Molecular Dynamics

2023-11-04 Thread MIGUEL MOLINOS PEREZ
Thank you Mark! I will have a look at it.

Best,
Miguel


On 4 Nov 2023, at 13:54, Matthew Knepley  wrote:


On Sat, Nov 4, 2023 at 8:40 AM Mark Adams 
<mfad...@lbl.gov> wrote:
Hi MIGUEL,

This might be a good place to start: https://petsc.org/main/manual/vec/
Feel free to ask more specific questions, but the docs are a good place to 
start.

Thanks,
Mark

On Fri, Nov 3, 2023 at 5:19 AM MIGUEL MOLINOS PEREZ 
<mmoli...@us.es> wrote:
Dear all,

I am currently working on the development of an in-house molecular dynamics code 
using PETSc and C++. So far the code works great; however, it is a little bit 
slow since I am not exploiting MPI for PETSc vectors. I was wondering if there 
is a way to perform the domain decomposition efficiently using some PETSc 
functionality. Any feedback is highly appreciated.

It sounds like you mean "is there a way to specify a communication construct 
that can send my particle
information automatically". We use PetscSF for that. You can see how this works 
with the DMSwarm class, which represents a particle discretization. You can 
either use that, or if it does not work for you, do the same things with your 
class.

  Thanks,

 Matt

Best regards,
Miguel


--
What most experimenters take for granted before they begin their experiments is 
infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/


[petsc-users] Domain decomposition in PETSc for Molecular Dynamics

2023-11-03 Thread MIGUEL MOLINOS PEREZ
Dear all, 

I am currently working on the development of an in-house molecular dynamics code 
using PETSc and C++. So far the code works great; however, it is a little bit 
slow since I am not exploiting MPI for PETSc vectors. I was wondering if there 
is a way to perform the domain decomposition efficiently using some PETSc 
functionality. Any feedback is highly appreciated.

Best regards,
Miguel