Re: [deal.II] Re: MPI, synchronize processes

2022-08-22 Thread Wolfgang Bangerth
On 8/22/22 09:55, Uclus Heis wrote: Would it also be a possible solution to export my testvec as it is right now (which contains the global solution), but instead of exporting with all the processes, call the print function only for one process? Yes. But that runs again into the same issue

Re: [deal.II] Re: MPI, synchronize processes

2022-08-22 Thread Uclus Heis
Dear Wolfgang, Thank you very much for the suggestion. Would it also be a possible solution to export my testvec as it is right now (which contains the global solution), but instead of exporting with all the processes, call the print function only for one process? Thank you On Mon, 22 Aug 2022 at

Re: [deal.II] Re: MPI, synchronize processes

2022-08-22 Thread Wolfgang Bangerth
On 8/21/22 04:29, Uclus Heis wrote: // testvec.print(outloop, 9, true, false); It is clear that the problem I have now is that I am exporting the completely_distributed_solution, and that is not what I want. Could you please inform me how to obtain the locally owned solution? I cannot find the

Re: [deal.II] Re: MPI, synchronize processes

2022-08-21 Thread Uclus Heis
Dear Wolfgang, Thank you for the clarifications. I am now trying to export one file per process (and per frequency) to avoid the issue I mentioned previously. However, what I get is a vector with the total number of DoFs instead of the locally owned DoFs. My solver function is

Re: [deal.II] Re: MPI, synchronize processes

2022-08-19 Thread Wolfgang Bangerth
On 8/19/22 14:25, Uclus Heis wrote: "That said, from your code, it looks like all processes are opening the same file and writing to it. Nothing good will come of this. There is of course also the issue that importing all vector elements to one process cannot scale to large numbers of

Re: [deal.II] Re: MPI, synchronize processes

2022-08-19 Thread Uclus Heis
Dear Wolfgang, Thank you very much for your answer. Regarding what you mentioned: "That said, from your code, it looks like all processes are opening the same file and writing to it. Nothing good will come of this. There is of course also the issue that importing all vector elements to

Re: [deal.II] Re: MPI, synchronize processes

2022-08-19 Thread Wolfgang Bangerth
On 8/19/22 03:25, Uclus Heis wrote: Is the way of extracting and exporting the solution with testvec = locally_relevant_solution a bad practice? I am saving the locally relevant solution from many different processes in one single file for a given frequency. I am afraid that there is no

Re: [deal.II] Re: MPI, synchronize processes

2022-08-19 Thread Uclus Heis
Dear all, after some time I came back to this problem again. I would kindly ask for some guidance to see if I can understand and solve the issue. I am using a parallel::distributed::Triangulation with MPI. I call the function solve() in a loop for different frequencies and want to export the

Re: [deal.II] Re: MPI, synchronize processes

2022-02-17 Thread Wolfgang Bangerth
On 2/17/22 09:22, Uclus Heis wrote: I still had problems, as I first copy the array and then store it in a matrix for different frequencies. The result I got was different when using a few processes compared to using a single process. I added the following code and now it works; is it right? It